Rename Lossless/Lossy to Hyperview/View #6

Merged
lentil merged 4 commits from chore/rename-as-hyperview into main 2025-07-09 14:37:37 -05:00
68 changed files with 862 additions and 746 deletions
Showing only changes of commit 50ac2b7b35


@ -42,7 +42,7 @@ npm run example-app
- DeltaV2 is the current format (DeltaV1 is legacy)
2. **Views**: Different ways to interpret the delta stream:
- **Lossless View**: Stores all deltas without conflict resolution
- **Hyperview View**: Stores all deltas without conflict resolution
- **Lossy Views**: Apply conflict resolution (e.g., Last-Write-Wins)
- Custom resolvers can be implemented
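The two view flavors above can be sketched in miniature. This is an illustrative model only, not the rhizome-node API — the `Delta` shape and function names here are invented for the example:

```typescript
// Illustrative sketch only — a minimal delta/view model, not the actual
// rhizome-node API. The Delta shape and function names are invented.
type Delta = { property: string; value: unknown; timestamp: number };

// A hyperview keeps every delta per property, with no conflict resolution.
function toHyperview(deltas: Delta[]): Map<string, Delta[]> {
  const byProperty = new Map<string, Delta[]>();
  for (const d of deltas) {
    const list = byProperty.get(d.property) ?? [];
    list.push(d);
    byProperty.set(d.property, list);
  }
  return byProperty;
}

// A lossy view applies a resolver over the hyperview; here, last-write-wins.
function lastWriteWins(deltas: Delta[]): Map<string, unknown> {
  const resolved = new Map<string, unknown>();
  toHyperview(deltas).forEach((list, property) => {
    const winner = list.reduce((a, b) => (b.timestamp >= a.timestamp ? b : a));
    resolved.set(property, winner.value);
  });
  return resolved;
}
```

Given two `name` deltas, the hyperview retains both while the last-write-wins view keeps only the later value — the same split the rename formalizes as Hyperview vs. View.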
@ -60,7 +60,7 @@ npm run example-app
- `src/node.ts`: Main `RhizomeNode` class orchestrating all components
- `src/delta.ts`: Delta data structures and conversion logic
- `src/lossless.ts`: Core lossless view implementation
- `src/hyperview.ts`: Core hyperview view implementation
- `src/collection-basic.ts`: Basic collection implementation
- `src/http/api.ts`: REST API endpoints
- `src/pub-sub.ts`: Network communication layer
@ -78,7 +78,7 @@ The HTTP API provides RESTful endpoints:
- `GET/PUT /collection/:name/:id` - Entity operations
- `GET /peers` - Peer information
- `GET /deltas/stats` - Delta statistics
- `GET /lossless/:entityId` - Raw delta access
- `GET /hyperview/:entityId` - Raw delta access
### Important Implementation Notes


@ -150,13 +150,13 @@ curl -s -X PUT -H 'content-type:application/json' -d @/tmp/user.json http://loca
| Peering | Yes | Implemented with ZeroMQ and/or Libp2p. Libp2p solves more problems. |
| Schemas | Not really | Currently very thin layer allowing TypedCollections |
| Relationships | No | Supporting relational algebra among domain entities |
| Views | Yes | Lossless: Map the `targetContext`s as properties of domain entities. |
| Views | Yes | Hyperview: Map the `targetContext`s as properties of domain entities. |
| | | Lossy: Use a delta filter and a resolver function to produce a view. |
| | | Currently using functions rather than JSON-Logic expressions. |
| Functions | No | Arbitrary subscribers to delta stream (that can also emit deltas?) |
| Tests | Yes | We are set up to run unit tests and multi-node tests |
| Identity | Sort of | We have an identity service via Libp2p |
| Contexts | No | Each context may involve different lossy functions and delta filters |
| Contexts | No | Each context may involve different view functions and delta filters |
| HTTP API | Yes | Basic peering info and entity CRUD |
If we express views and filter rules as JSON-Logic, we can easily include them in records.
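As a sketch of that idea, a resolver rule could be a JSON-Logic expression stored alongside the data. The rule shape below and the micro-evaluator (covering just the `var`, `>`, and `if` operators) are hypothetical — a real implementation would use a full JSON-Logic library:

```typescript
// Hypothetical: a resolver rule as a JSON-Logic expression, plus a
// micro-evaluator for only the operators this example uses.
type Rule = { [op: string]: unknown[] };

function evalRule(rule: Rule, data: Record<string, unknown>): unknown {
  const [op, args] = Object.entries(rule)[0];
  // Arguments may themselves be nested rules; literals pass through.
  const resolve = (a: unknown): unknown =>
    typeof a === 'object' && a !== null ? evalRule(a as Rule, data) : a;
  switch (op) {
    case 'var': return data[String(args[0])];
    case '>':   return Number(resolve(args[0])) > Number(resolve(args[1]));
    case 'if':  return resolve(args[0]) ? resolve(args[1]) : resolve(args[2]);
    default:    throw new Error(`unsupported op: ${op}`);
  }
}

// A last-write-wins rule, expressed as data rather than a function:
const pickNewer: Rule = {
  if: [
    { '>': [{ var: ['incomingTs'] }, { var: ['currentTs'] }] },
    'incoming',
    'current',
  ],
};
```

Because the rule is plain JSON, it can be serialized into a record and shipped between peers, which a function-valued resolver cannot.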


@ -90,7 +90,7 @@ graph TD
- Design storage layer (Mnesia/ETS)
- **View System**
- Implement Lossy/Lossless views
- Implement Lossy/Hyperview views
- Create resolver framework
- Add caching layer
@ -237,7 +237,7 @@ sequenceDiagram
- [ ] Define delta types and operations
- [ ] Implement DeltaBuilder
- [ ] Basic storage with Mnesia/ETS
- [ ] View system with Lossy/Lossless support
- [ ] View system with Lossy/Hyperview support
### 2. Distributed Foundation
- [ ] Node discovery and membership


@ -2,10 +2,10 @@
- [x] Organize tests?
- [x] More documentation in docs/
- [ ] Rename/consolidate, lossless view() and compose() --> composeView()
- [ ] Rename Lossless to HyperView
- [ ] Rename Lossy to View
- [ ] Consider whether we should use collapsed deltas
- [x] Rename/consolidate, hyperview view() and compose() --> composeView()
- [x] Rename Hyperview to HyperView
- [x] Rename Lossy to View
- [x] Consider whether we should use collapsed deltas
- [ ] Improve ergonomics of declaring multiple entity properties in one delta
- [x] Use dotenv so we can more easily manage the local dev test environment
- [ ] Create test helpers to reduce boilerplate
- [x] Create test helpers to reduce boilerplate


@ -1,5 +1,5 @@
# Test structure
- before test, initialize node and lossless view
- before test, initialize node and hyperview view
- when test begins, create and ingest a series of deltas
- instantiate a resolver, in this case using custom resolver plugins
- call the resolver's initializer with the view


@ -1,5 +1,5 @@
import { RhizomeNode } from '@src';
import { Lossless } from '@src/views/lossless';
import { Hyperview } from '@src/views/hyperview';
import { Delta } from '@src/core/delta';
import { createDelta } from '@src/core/delta-builder';
import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
@ -29,12 +29,12 @@ export async function testResolverWithPlugins<T extends TestPluginMap>(
// Setup test environment
const node = new RhizomeNode();
const lossless = new Lossless(node);
const view = new CustomResolver(lossless, plugins);
const hyperview = new Hyperview(node);
const view = new CustomResolver(hyperview, plugins);
// Ingest all deltas through the lossless instance
// Ingest all deltas through the hyperview instance
for (const delta of deltas) {
lossless.ingestDelta(delta);
hyperview.ingestDelta(delta);
}
// Get the resolved view


@ -1,4 +1,4 @@
import { LosslessViewOne } from '@src/views/lossless';
import { HyperviewViewOne } from '@src/views/hyperview';
import {
SchemaBuilder,
PrimitiveSchemas,
@ -153,12 +153,12 @@ describe('Schema System', () => {
expect(schemaRegistry.hasCircularDependencies()).toBe(true);
});
test('should validate lossless views against schemas', () => {
test('should validate hyperview views against schemas', () => {
const userSchema = CommonSchemas.User();
schemaRegistry.register(userSchema);
// Create a valid lossless view
const validView: LosslessViewOne = {
// Create a valid hyperview view
const validView: HyperviewViewOne = {
id: 'user123',
propertyDeltas: {
name: [
@ -179,7 +179,7 @@ describe('Schema System', () => {
expect(result.errors).toHaveLength(0);
// Test invalid view (missing required property)
const invalidView: LosslessViewOne = {
const invalidView: HyperviewViewOne = {
id: 'user456',
propertyDeltas: {
age: [
@ -212,7 +212,7 @@ describe('Schema System', () => {
schemaRegistry.register(schema);
// Valid types
const validView: LosslessViewOne = {
const validView: HyperviewViewOne = {
id: 'test1',
propertyDeltas: {
stringProp: [
@ -240,7 +240,7 @@ describe('Schema System', () => {
expect(validResult.valid).toBe(true);
// Invalid types
const invalidView: LosslessViewOne = {
const invalidView: HyperviewViewOne = {
id: 'test2',
propertyDeltas: {
stringProp: [
@ -331,7 +331,7 @@ describe('Schema System', () => {
.addPointer('users', 'user3', 'email')
.addPointer('email', 'invalid@test.com')
.buildV1();
node.lossless.ingestDelta(invalidDelta);
node.hyperview.ingestDelta(invalidDelta);
const stats = collection.getValidationStats();
expect(stats.totalEntities).toBe(3);
@ -355,11 +355,11 @@ describe('Schema System', () => {
const invalidDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty('user3', 'age', 'not-a-number', 'users')
.buildV1();
node.lossless.ingestDelta(invalidDelta);
node.hyperview.ingestDelta(invalidDelta);
debug(`Manually ingested invalid delta: ${JSON.stringify(invalidDelta)}`)
debug(`Lossless view: ${JSON.stringify(node.lossless.compose(), null, 2)}`)
debug(`Hyperview view: ${JSON.stringify(node.hyperview.compose(), null, 2)}`)
const validIds = collection.getValidEntities();
expect(validIds).toContain('user1');
@ -371,7 +371,7 @@ describe('Schema System', () => {
expect(invalidEntities[0].entityId).toBe('user3');
});
test('should apply schema to lossless views', async () => {
test('should apply schema to hyperview views', async () => {
const userSchema = CommonSchemas.User();
const collection = new TypedCollectionImpl<{
name: string;


@ -1,7 +1,7 @@
import { createDelta } from '@src/core/delta-builder';
import {
RhizomeNode,
Lossless,
Hyperview,
SumResolver,
CustomResolver,
LastWriteWinsPlugin,
@ -13,28 +13,28 @@ const debug = Debug('rz:test:performance');
describe('Concurrent Write Scenarios', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('Simultaneous Writes with Same Timestamp', () => {
test('should handle simultaneous writes using last-write-wins resolver', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
const timestamp = 1000;
// Simulate two writers updating the same property at the exact same time
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withId('delta-a')
.withTimestamp(timestamp)
.setProperty('entity1', 'score', 100, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('writer2', 'host2')
hyperview.ingestDelta(createDelta('writer2', 'host2')
.withId('delta-b')
.withTimestamp(timestamp) // Same timestamp
.setProperty('entity1', 'score', 200, 'collection')
@ -52,16 +52,16 @@ describe('Concurrent Write Scenarios', () => {
test('should handle simultaneous writes using timestamp resolver with tie-breaking', () => {
const timestamp = 1000;
const resolver = new TimestampResolver(lossless, 'creator-id');
const resolver = new TimestampResolver(hyperview, 'creator-id');
lossless.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
hyperview.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
.withId('delta-a')
.withTimestamp(timestamp)
.setProperty('entity1', 'score', 100, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
hyperview.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
.withId('delta-b')
.withTimestamp(timestamp) // Same timestamp
.setProperty('entity1', 'score', 200, 'collection')
@ -76,23 +76,23 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle multiple writers with aggregation resolver', () => {
const resolver = new SumResolver(lossless, ['points']);
const resolver = new SumResolver(hyperview, ['points']);
// Multiple writers add values simultaneously
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'points', 10, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('writer2', 'host2')
hyperview.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(1000) // Same timestamp
.setProperty('entity1', 'points', 20, 'collection')
.buildV1()
);
// Third writer adds another value
lossless.ingestDelta(createDelta('writer3', 'host3')
hyperview.ingestDelta(createDelta('writer3', 'host3')
.withTimestamp(1000) // Same timestamp
.setProperty('entity1', 'points', 30, 'collection')
.buildV1()
@ -108,10 +108,10 @@ describe('Concurrent Write Scenarios', () => {
describe('Out-of-Order Write Arrival', () => {
test('should handle writes arriving out of chronological order', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
// Newer delta arrives first
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 'newer')
@ -119,7 +119,7 @@ describe('Concurrent Write Scenarios', () => {
);
// Older delta arrives later
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 'older')
@ -134,24 +134,24 @@ describe('Concurrent Write Scenarios', () => {
});
test('should maintain correct aggregation despite out-of-order arrival', () => {
const resolver = new SumResolver(lossless, ['score']);
const resolver = new SumResolver(hyperview, ['score']);
// Add deltas in reverse chronological order
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(3000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 30)
.buildV1()
);
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 10)
.buildV1()
);
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 20)
@ -168,7 +168,7 @@ describe('Concurrent Write Scenarios', () => {
describe('High-Frequency Concurrent Updates', () => {
test('should handle rapid concurrent updates to the same entity', () => {
const resolver = new SumResolver(lossless, ['counter']);
const resolver = new SumResolver(hyperview, ['counter']);
const baseTimestamp = 1000;
const numWriters = 10;
@ -177,7 +177,7 @@ describe('Concurrent Write Scenarios', () => {
// Simulate multiple writers making rapid updates
for (let writer = 0; writer < numWriters; writer++) {
for (let write = 0; write < writesPerWriter; write++) {
lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
.withTimestamp(baseTimestamp + write)
.addPointer('collection', 'entity1', 'counter')
.addPointer('counter', 1)
@ -194,7 +194,7 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle concurrent updates to multiple properties', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
name: new LastWriteWinsPlugin(),
score: new LastWriteWinsPlugin()
});
@ -202,14 +202,14 @@ describe('Concurrent Write Scenarios', () => {
const timestamp = 1000;
// Writer 1 updates name and score
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'name')
.addPointer('name', 'alice')
.buildV1()
);
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp + 1)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 100)
@ -217,14 +217,14 @@ describe('Concurrent Write Scenarios', () => {
);
// Writer 2 updates name and score concurrently
lossless.ingestDelta(createDelta('writer2', 'host2')
hyperview.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp + 2)
.addPointer('collection', 'entity1', 'name')
.addPointer('name', 'bob')
.buildV1()
);
lossless.ingestDelta(createDelta('writer2', 'host2')
hyperview.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp + 3)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 200)
@ -241,13 +241,13 @@ describe('Concurrent Write Scenarios', () => {
describe('Cross-Entity Concurrent Writes', () => {
test('should handle concurrent writes to different entities', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
const timestamp = 1000;
// Multiple writers updating different entities simultaneously
for (let i = 0; i < 5; i++) {
lossless.ingestDelta(createDelta(`writer${i}`, `host${i}`)
hyperview.ingestDelta(createDelta(`writer${i}`, `host${i}`)
.withTimestamp(timestamp)
.addPointer('collection', `entity${i}`, 'value')
.addPointer('value', (i + 1) * 10)
@ -266,28 +266,28 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle mixed entity and property conflicts', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
votes: new MajorityVotePlugin(),
status: new LastWriteWinsPlugin()
});
const timestamp = 1000;
// Entity1: Multiple writers competing for same property
lossless.ingestDelta(createDelta('writer1', 'host1')
hyperview.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_a')
.buildV1()
);
lossless.ingestDelta(createDelta('writer2', 'host2')
hyperview.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_a')
.buildV1()
);
lossless.ingestDelta(createDelta('writer3', 'host3')
hyperview.ingestDelta(createDelta('writer3', 'host3')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_b')
@ -295,7 +295,7 @@ describe('Concurrent Write Scenarios', () => {
);
// Entity2: Single writer, no conflict
lossless.ingestDelta(createDelta('writer4', 'host4')
hyperview.ingestDelta(createDelta('writer4', 'host4')
.withTimestamp(timestamp)
.addPointer('collection', 'entity2', 'status')
.addPointer('status', 'active')
@ -312,7 +312,7 @@ describe('Concurrent Write Scenarios', () => {
describe('Stress Testing', () => {
test('should handle large number of concurrent writes efficiently', () => {
const resolver = new SumResolver(lossless, ['score']);
const resolver = new SumResolver(hyperview, ['score']);
const numEntities = 100;
const numWritersPerEntity = 10;
@ -321,7 +321,7 @@ describe('Concurrent Write Scenarios', () => {
// Generate a large number of concurrent writes
for (let entity = 0; entity < numEntities; entity++) {
for (let writer = 0; writer < numWritersPerEntity; writer++) {
lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
.withTimestamp(baseTimestamp + Math.floor(Math.random() * 1000))
.addPointer('collection', `entity${entity}`, 'score')
.addPointer('score', Math.floor(Math.random() * 100))
@ -344,14 +344,14 @@ describe('Concurrent Write Scenarios', () => {
});
test('should maintain consistency under rapid updates and resolution calls', () => {
const resolver = new SumResolver(lossless, ['counter']);
const resolver = new SumResolver(hyperview, ['counter']);
const entityId = 'stress-test-entity';
let updateCount = 0;
// Add initial deltas
for (let i = 0; i < 50; i++) {
lossless.ingestDelta(createDelta(
hyperview.ingestDelta(createDelta(
`writer${i % 5}`,
`host${i % 3}`
)
@ -370,7 +370,7 @@ describe('Concurrent Write Scenarios', () => {
// Add more deltas and verify consistency
for (let i = 0; i < 25; i++) {
lossless.ingestDelta(createDelta('late-writer', 'late-host')
hyperview.ingestDelta(createDelta('late-writer', 'late-host')
.withTimestamp(2000 + i)
.addPointer('collection', entityId, 'counter')
.addPointer('counter', 2)


@ -83,7 +83,7 @@ describe('Nested Object Resolution Performance', () => {
const friendshipDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'friends', friendId, 'users')
.buildV1();
node.lossless.ingestDelta(friendshipDelta);
node.hyperview.ingestDelta(friendshipDelta);
}
}
@ -96,7 +96,7 @@ describe('Nested Object Resolution Performance', () => {
const followDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'followers', followerId, 'users')
.buildV1();
node.lossless.ingestDelta(followDelta);
node.hyperview.ingestDelta(followDelta);
}
}
@ -107,7 +107,7 @@ describe('Nested Object Resolution Performance', () => {
const mentorshipDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'mentor', mentorId, 'users')
.buildV1();
node.lossless.ingestDelta(mentorshipDelta);
node.hyperview.ingestDelta(mentorshipDelta);
}
}
@ -116,7 +116,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution performance for a user with many connections
const testUserId = userIds[50]; // Pick a user in the middle
const userViews = node.lossless.compose([testUserId]);
const userViews = node.hyperview.compose([testUserId]);
const userView = userViews[testUserId];
const startResolution = performance.now();
@ -124,7 +124,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'network-user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -197,7 +197,7 @@ describe('Nested Object Resolution Performance', () => {
const linkDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(currentId, 'next', nextId, 'users')
.buildV1();
node.lossless.ingestDelta(linkDelta);
node.hyperview.ingestDelta(linkDelta);
}
const setupTime = performance.now() - startSetup;
@ -205,7 +205,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution from the start of the chain
const firstUserId = userIds[0];
const userViews = node.lossless.compose([firstUserId]);
const userViews = node.hyperview.compose([firstUserId]);
const userView = userViews[firstUserId];
const startResolution = performance.now();
@ -213,7 +213,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'chain-user',
node.lossless,
node.hyperview,
{ maxDepth: 5 } // Should resolve 5 levels deep
);
@ -292,7 +292,7 @@ describe('Nested Object Resolution Performance', () => {
.addPointer('users', userId, 'connections')
.addPointer('connections', connectedId)
.buildV1();
node.lossless.ingestDelta(connectionDelta);
node.hyperview.ingestDelta(connectionDelta);
}
}
@ -301,7 +301,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution performance with circular references
const testUserId = userIds[0];
const userViews = node.lossless.compose([testUserId]);
const userViews = node.hyperview.compose([testUserId]);
const userView = userViews[testUserId];
const startResolution = performance.now();
@ -309,7 +309,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'circular-user',
node.lossless,
node.hyperview,
{ maxDepth: 3 }
);


@ -1,13 +1,13 @@
/**
* Tests for lossless view compose() and decompose() bidirectional conversion
* Ensures that deltas can be composed into lossless views and decomposed back
* Tests for hyperview view compose() and decompose() bidirectional conversion
* Ensures that deltas can be composed into hyperview views and decomposed back
* to the original deltas with all pointer relationships preserved.
*/
import { RhizomeNode } from '@src/node';
import { createDelta } from '@src/core/delta-builder';
describe('Lossless View Compose/Decompose', () => {
describe('Hyperview View Compose/Decompose', () => {
let node: RhizomeNode;
beforeEach(() => {
@ -29,10 +29,10 @@ describe('Lossless View Compose/Decompose', () => {
];
// Ingest the deltas
nameDeltas.forEach(delta => node.lossless.ingestDelta(delta));
nameDeltas.forEach(delta => node.hyperview.ingestDelta(delta));
// Compose lossless view
const composed = node.lossless.compose(['alice']);
// Compose hyperview view
const composed = node.hyperview.compose(['alice']);
const aliceView = composed['alice'];
expect(aliceView).toBeDefined();
@ -41,7 +41,7 @@ describe('Lossless View Compose/Decompose', () => {
expect(aliceView.propertyDeltas.email).toHaveLength(1);
// Decompose back to deltas
const decomposed = node.lossless.decompose(aliceView);
const decomposed = node.hyperview.decompose(aliceView);
expect(decomposed).toHaveLength(2);
@ -73,12 +73,12 @@ describe('Lossless View Compose/Decompose', () => {
.addPointer('intensity', 8)
.buildV1();
node.lossless.ingestDelta(relationshipDelta);
node.hyperview.ingestDelta(relationshipDelta);
// Compose and decompose
const composed = node.lossless.compose(['alice']);
const composed = node.hyperview.compose(['alice']);
const aliceView = composed['alice'];
const decomposed = node.lossless.decompose(aliceView);
const decomposed = node.hyperview.decompose(aliceView);
expect(decomposed).toHaveLength(1);
const reconstituted = decomposed[0];
@ -119,16 +119,16 @@ describe('Lossless View Compose/Decompose', () => {
.addPointer('friend', 'bob', 'friends')
.buildV1();
[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.lossless.ingestDelta(d));
[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.hyperview.ingestDelta(d));
// Compose Alice's view
const composed = node.lossless.compose(['alice']);
const composed = node.hyperview.compose(['alice']);
const aliceView = composed['alice'];
expect(aliceView.propertyDeltas.friends).toHaveLength(1);
// Decompose and verify the friendship delta is correctly reconstructed
const decomposed = node.lossless.decompose(aliceView);
const decomposed = node.hyperview.decompose(aliceView);
const friendshipReconstituted = decomposed.find(d =>
d.pointers.some(p => p.localContext === 'friend')
);
@ -152,10 +152,10 @@ describe('Lossless View Compose/Decompose', () => {
.addPointer('name', 'Alice')
.buildV1();
node.lossless.ingestDelta(originalDelta);
node.hyperview.ingestDelta(originalDelta);
const composed = node.lossless.compose(['alice']);
const decomposed = node.lossless.decompose(composed['alice']);
const composed = node.hyperview.compose(['alice']);
const decomposed = node.hyperview.decompose(composed['alice']);
expect(decomposed).toHaveLength(1);
const reconstituted = decomposed[0];
@ -184,15 +184,15 @@ describe('Lossless View Compose/Decompose', () => {
.buildV1()
];
nameDeltas.forEach(d => node.lossless.ingestDelta(d));
nameDeltas.forEach(d => node.hyperview.ingestDelta(d));
const composed = node.lossless.compose(['alice']);
const composed = node.hyperview.compose(['alice']);
const aliceView = composed['alice'];
// Should have 3 deltas for the name property
expect(aliceView.propertyDeltas.name).toHaveLength(3);
const decomposed = node.lossless.decompose(aliceView);
const decomposed = node.hyperview.decompose(aliceView);
// Should decompose back to 3 separate deltas
expect(decomposed).toHaveLength(3);
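The compose/decompose round trip these tests exercise can be summarized with a toy model. This is a conceptual sketch, not the library's implementation — `compose()` groups deltas under their target property, and `decompose()` recovers the original deltas:

```typescript
// Conceptual sketch of compose/decompose, not the rhizome-node implementation.
type Delta = { id: string; property: string; value: unknown };
type EntityView = { id: string; propertyDeltas: Record<string, Delta[]> };

// Group an entity's deltas by the property they target.
function compose(entityId: string, deltas: Delta[]): EntityView {
  const propertyDeltas: Record<string, Delta[]> = {};
  for (const d of deltas) {
    if (!propertyDeltas[d.property]) propertyDeltas[d.property] = [];
    propertyDeltas[d.property].push(d);
  }
  return { id: entityId, propertyDeltas };
}

// Flatten the grouped deltas back out; nothing is lost in the round trip.
function decompose(view: EntityView): Delta[] {
  return Object.values(view.propertyDeltas).reduce<Delta[]>(
    (all, list) => all.concat(list),
    []
  );
}
```

Since a hyperview discards nothing, `decompose(compose(id, deltas))` yields the same set of deltas — the invariant the tests above verify against the real implementation.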


@ -1,6 +1,6 @@
import { createDelta } from '@src/core/delta-builder';
import { DeltaV1, DeltaV2 } from '@src/core/delta';
import { Lossless } from '@src/views/lossless';
import { Hyperview } from '@src/views/hyperview';
import { RhizomeNode } from '@src/node';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
@ -45,10 +45,10 @@ describe('DeltaBuilder', () => {
});
// Verify that the entity property resolves correctly
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve();
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve();
expect(result).toBeDefined();
expect(result!['entity-1'].properties.name).toBe('Test Entity');
});
@ -70,10 +70,10 @@ describe('DeltaBuilder', () => {
});
// Verify that the entity property resolves correctly
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve();
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve();
expect(result).toBeDefined();
expect(result!['entity-1'].properties.name).toBe('Test Entity');
});
@ -170,10 +170,10 @@ describe('DeltaBuilder', () => {
expect(delta.pointers).toHaveProperty('target', 'user-2');
expect(delta.pointers).toHaveProperty('type', 'follows');
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve([relId]);
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve([relId]);
expect(result).toBeDefined();
expect(result![relId]).toMatchObject({
properties: {
@ -200,10 +200,10 @@ describe('DeltaBuilder', () => {
expect(delta.pointers).toHaveProperty('type', 'follows');
expect(delta.pointers).toHaveProperty('version', 1);
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve([relId]);
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve([relId]);
expect(result).toBeDefined();
expect(result![relId]).toMatchObject({
properties: {


@ -2,17 +2,17 @@ import Debug from 'debug';
import { createDelta } from '@src/core/delta-builder';
import { NegationHelper } from '@src/features';
import { RhizomeNode } from '@src/node';
import { Lossless } from '@src/views';
import { Hyperview } from '@src/views';
const debug = Debug('rz:negation:test');
describe('Negation System', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('Negation Helper', () => {
@ -179,8 +179,8 @@ describe('Negation System', () => {
});
});
describe('Lossless View Integration', () => {
test('should filter negated deltas in lossless views', () => {
describe('Hyperview View Integration', () => {
test('should filter negated deltas in hyperview views', () => {
// Create original delta
const originalDelta = createDelta('user1', 'host1')
.setProperty('user123', 'name', 'Alice')
@ -198,12 +198,12 @@ describe('Negation System', () => {
.buildV1();
// Ingest all deltas
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
lossless.ingestDelta(nonNegatedDelta);
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
hyperview.ingestDelta(nonNegatedDelta);
// Get view - should only show non-negated delta
const view = lossless.compose(['user123']);
const view = hyperview.compose(['user123']);
expect(view.user123).toBeDefined();
@ -220,11 +220,11 @@ describe('Negation System', () => {
const negation1 = createDelta('mod1', 'host1').negate(originalDelta.id).buildV1();
const negation2 = createDelta('mod2', 'host1').negate(originalDelta.id).buildV1();
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negation1);
lossless.ingestDelta(negation2);
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negation1);
hyperview.ingestDelta(negation2);
const view = lossless.compose(['post1']);
const view = hyperview.compose(['post1']);
// Original delta should be negated (not visible)
expect(view.post1).toBeUndefined();
@ -241,11 +241,11 @@ describe('Negation System', () => {
const negation1 = createDelta('mod1', 'host1').negate(delta1.id).buildV1();
lossless.ingestDelta(delta1);
lossless.ingestDelta(delta2);
lossless.ingestDelta(negation1);
hyperview.ingestDelta(delta1);
hyperview.ingestDelta(delta2);
hyperview.ingestDelta(negation1);
const stats = lossless.getNegationStats('article1');
const stats = hyperview.getNegationStats('article1');
expect(stats.totalDeltas).toBe(3);
expect(stats.negationDeltas).toBe(1);
@ -262,10 +262,10 @@ describe('Negation System', () => {
const negationDelta = createDelta('admin', 'host1').negate(originalDelta.id).buildV1();
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
const negations = lossless.getNegationDeltas('task1');
const negations = hyperview.getNegationDeltas('task1');
expect(negations).toHaveLength(1);
expect(negations[0].id).toBe(negationDelta.id);
expect(negations[0].creator).toBe('admin');
@ -275,7 +275,7 @@ describe('Negation System', () => {
const transactionId = 'tx-negation';
// Create transaction declaration
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
@ -294,11 +294,11 @@ describe('Negation System', () => {
targetContext: 'deltas'
});
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
// Transaction should complete, but original delta should be negated
const view = lossless.compose(['post1']);
const view = hyperview.compose(['post1']);
expect(view.post1).toBeUndefined(); // No visible deltas
});
@ -321,11 +321,11 @@ describe('Negation System', () => {
.setProperty('post1', 'content', 'Edited post')
.buildV1();
lossless.ingestDelta(postDelta);
lossless.ingestDelta(negationDelta);
lossless.ingestDelta(editDelta);
hyperview.ingestDelta(postDelta);
hyperview.ingestDelta(negationDelta);
hyperview.ingestDelta(editDelta);
const view = lossless.compose(['post1']);
const view = hyperview.compose(['post1']);
// Should show edited content (edit happened after negation)
expect(view.post1).toBeDefined();
@ -341,10 +341,10 @@ describe('Negation System', () => {
test('should handle negation of non-existent deltas', () => {
const negationDelta = createDelta('moderator', 'host1').negate('non-existent-delta-id').buildV1();
lossless.ingestDelta(negationDelta);
hyperview.ingestDelta(negationDelta);
// Should not crash and stats should reflect the orphaned negation
const stats = lossless.getNegationStats('entity1');
const stats = hyperview.getNegationStats('entity1');
expect(stats.negationDeltas).toBe(0); // No negations for this entity
});
@ -357,16 +357,16 @@ describe('Negation System', () => {
const negationDelta = createDelta('admin', 'host1').negate(selfRefDelta.id).buildV1();
lossless.ingestDelta(selfRefDelta);
lossless.ingestDelta(negationDelta);
hyperview.ingestDelta(selfRefDelta);
hyperview.ingestDelta(negationDelta);
const view = lossless.compose(['node1']);
const view = hyperview.compose(['node1']);
expect(view.node1).toBeUndefined(); // Should be negated
});
test('should handle multiple direct negations of the same delta', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
const testHyperview = new Hyperview(testNode);
// Create the original delta
const originalDelta = createDelta('user1', 'host1')
@ -378,18 +378,18 @@ describe('Negation System', () => {
const negation2 = createDelta('user3', 'host1').negate(originalDelta.id).buildV1();
// Process all deltas
testLossless.ingestDelta(originalDelta);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
testHyperview.ingestDelta(originalDelta);
testHyperview.ingestDelta(negation1);
testHyperview.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testLossless.compose(['entity2']);
const view = testHyperview.compose(['entity2']);
// The original delta should be negated (not in view) because it has two direct negations
expect(view.entity2).toBeUndefined();
// Verify the stats
const stats = testLossless.getNegationStats('entity2');
const stats = testHyperview.getNegationStats('entity2');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(1);
expect(stats.effectiveDeltas).toBe(0);
@ -397,7 +397,7 @@ describe('Negation System', () => {
test('should handle complex negation chains', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
const testHyperview = new Hyperview(testNode);
// Create the original delta
const deltaA = createDelta('user1', 'host1')
@ -415,13 +415,13 @@ describe('Negation System', () => {
debug('Delta D (negates C): %s', deltaD.id);
// Process all deltas in order
testLossless.ingestDelta(deltaA);
testLossless.ingestDelta(deltaB);
testLossless.ingestDelta(deltaC);
testLossless.ingestDelta(deltaD);
testHyperview.ingestDelta(deltaA);
testHyperview.ingestDelta(deltaB);
testHyperview.ingestDelta(deltaC);
testHyperview.ingestDelta(deltaD);
// Get the view after processing all deltas
const view = testLossless.compose(['entity3']);
const view = testHyperview.compose(['entity3']);
// The original delta should be negated because:
// - B negates A
@ -433,7 +433,7 @@ describe('Negation System', () => {
const allDeltas = [deltaA, deltaB, deltaC, deltaD];
// Get the stats
const stats = testLossless.getNegationStats('entity3');
const stats = testHyperview.getNegationStats('entity3');
const isANegated = NegationHelper.isDeltaNegated(deltaA.id, allDeltas);
const isBNegated = NegationHelper.isDeltaNegated(deltaB.id, allDeltas);
const isCNegated = NegationHelper.isDeltaNegated(deltaC.id, allDeltas);
@ -470,7 +470,7 @@ describe('Negation System', () => {
test('should handle multiple independent negations', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
const testHyperview = new Hyperview(testNode);
// Create two independent deltas
const delta1 = createDelta('user1', 'host1')
@ -486,19 +486,19 @@ describe('Negation System', () => {
const negation2 = createDelta('user4', 'host1').negate(delta2.id).buildV1();
// Process all deltas
testLossless.ingestDelta(delta1);
testLossless.ingestDelta(delta2);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
testHyperview.ingestDelta(delta1);
testHyperview.ingestDelta(delta2);
testHyperview.ingestDelta(negation1);
testHyperview.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testLossless.compose(['entity4']);
const view = testHyperview.compose(['entity4']);
// Both deltas should be negated
expect(view.entity4).toBeUndefined();
// Verify the stats
const stats = testLossless.getNegationStats('entity4');
const stats = testHyperview.getNegationStats('entity4');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(2);
expect(stats.effectiveDeltas).toBe(0);

View File

@ -1,15 +1,15 @@
import { createDelta } from '@src/core/delta-builder';
import { Lossless } from '@src/views';
import { Hyperview } from '@src/views';
import { RhizomeNode } from '@src/node';
import { DeltaFilter } from '@src/core';
describe('Transactions', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('Transaction-based filtering', () => {
@ -34,12 +34,12 @@ describe('Transactions', () => {
.buildV1();
// Ingest transaction declaration and first two deltas
lossless.ingestDelta(txDeclaration);
lossless.ingestDelta(delta1);
lossless.ingestDelta(delta2);
hyperview.ingestDelta(txDeclaration);
hyperview.ingestDelta(delta1);
hyperview.ingestDelta(delta2);
// View should be empty because transaction is incomplete (2/3 deltas)
const view = lossless.compose(['user123']);
const view = hyperview.compose(['user123']);
expect(view.user123).toBeUndefined();
// Add the third delta to complete the transaction
@ -48,10 +48,10 @@ describe('Transactions', () => {
.setProperty('user123', 'email', 'alice@example.com')
.buildV1();
lossless.ingestDelta(delta3);
hyperview.ingestDelta(delta3);
// Now the view should include all deltas from the completed transaction
const completeView = lossless.compose(['user123']);
const completeView = hyperview.compose(['user123']);
expect(completeView.user123).toBeDefined();
expect(completeView.user123.propertyDeltas.name).toHaveLength(1);
expect(completeView.user123.propertyDeltas.age).toHaveLength(1);
@ -63,57 +63,57 @@ describe('Transactions', () => {
const tx2 = 'tx-002';
// Declare two transactions
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(tx1, 2)
.buildV1()
);
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(tx2, 2)
.buildV1()
);
// Add deltas for both transactions
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(tx1)
.setProperty('order1', 'status', 'pending')
.buildV1()
);
lossless.ingestDelta(createDelta('user2', 'host2')
hyperview.ingestDelta(createDelta('user2', 'host2')
.inTransaction(tx2)
.setProperty('order2', 'status', 'shipped')
.buildV1()
);
// Neither transaction is complete
let view = lossless.compose(['order1', 'order2']);
let view = hyperview.compose(['order1', 'order2']);
expect(view.order1).toBeUndefined();
expect(view.order2).toBeUndefined();
// Complete tx1
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(tx1)
.setProperty('order1', 'total', 100)
.buildV1()
);
// tx1 is complete, tx2 is not
view = lossless.compose(['order1', 'order2']);
view = hyperview.compose(['order1', 'order2']);
expect(view.order1).toBeDefined();
expect(view.order1.propertyDeltas.status).toHaveLength(1);
expect(view.order1.propertyDeltas.total).toHaveLength(1);
expect(view.order2).toBeUndefined();
// Complete tx2
lossless.ingestDelta(createDelta('user2', 'host2')
hyperview.ingestDelta(createDelta('user2', 'host2')
.inTransaction(tx2)
.setProperty('order2', 'tracking', 'TRACK123')
.buildV1()
);
// Both transactions complete
view = lossless.compose(['order1', 'order2']);
view = hyperview.compose(['order1', 'order2']);
expect(view.order1).toBeDefined();
expect(view.order2).toBeDefined();
expect(view.order2.propertyDeltas.status).toHaveLength(1);
@ -124,19 +124,19 @@ describe('Transactions', () => {
const transactionId = 'tx-filter-test';
// Create transaction with 2 deltas
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add both deltas
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('doc1', 'type', 'report')
.buildV1()
);
lossless.ingestDelta(createDelta('user2', 'host2')
hyperview.ingestDelta(createDelta('user2', 'host2')
.inTransaction(transactionId)
.setProperty('doc1', 'author', 'Bob')
.buildV1()
@ -147,7 +147,7 @@ describe('Transactions', () => {
// With incomplete transaction, nothing should show
// But once complete, the filter should still apply
const view = lossless.compose(['doc1'], userFilter);
const view = hyperview.compose(['doc1'], userFilter);
// Even though transaction is complete, only delta from user1 should appear
expect(view.doc1).toBeDefined();
@ -159,13 +159,13 @@ describe('Transactions', () => {
const transactionId = 'tx-multi-entity';
// Transaction that updates multiple entities atomically
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 3)
.buildV1()
);
// Transfer money from account1 to account2
lossless.ingestDelta(createDelta('bank', 'host1')
hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('balance', 'account1', 'balance')
.addPointer('value', 900)
@ -173,7 +173,7 @@ describe('Transactions', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('bank', 'host1')
hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('balance', 'account2', 'balance')
.addPointer('value', 1100)
@ -182,12 +182,12 @@ describe('Transactions', () => {
);
// Transaction incomplete - no entities should show updates
let view = lossless.compose(['account1', 'account2']);
let view = hyperview.compose(['account1', 'account2']);
expect(view.account1).toBeUndefined();
expect(view.account2).toBeUndefined();
// Complete transaction with audit log
lossless.ingestDelta(createDelta('bank', 'host1')
hyperview.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('transfer', 'transfer123', 'details')
.addPointer('from', 'account1')
@ -197,7 +197,7 @@ describe('Transactions', () => {
);
// All entities should now be visible
view = lossless.compose(['account1', 'account2', 'transfer123']);
view = hyperview.compose(['account1', 'account2', 'transfer123']);
expect(view.account1).toBeDefined();
expect(view.account1.propertyDeltas.balance).toHaveLength(1);
expect(view.account2).toBeDefined();
@ -211,12 +211,12 @@ describe('Transactions', () => {
const updateEvents: Array<{ entityId: string, deltaIds: string[] }> = [];
// Listen for update events
lossless.eventStream.on('updated', (entityId: string, deltaIds: string[]) => {
hyperview.eventStream.on('updated', (entityId: string, deltaIds: string[]) => {
updateEvents.push({ entityId, deltaIds });
});
// Create transaction
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
@ -226,7 +226,7 @@ describe('Transactions', () => {
.inTransaction(transactionId)
.setProperty('entity1', 'field1', 'value1')
.buildV1();
lossless.ingestDelta(delta1);
hyperview.ingestDelta(delta1);
// No events should be emitted yet
expect(updateEvents).toHaveLength(0);
@ -236,7 +236,7 @@ describe('Transactions', () => {
.inTransaction(transactionId)
.setProperty('entity1', 'field2', 'value2')
.buildV1();
lossless.ingestDelta(delta2);
hyperview.ingestDelta(delta2);
// Wait for async event processing
await new Promise(resolve => setTimeout(resolve, 10));
@ -256,20 +256,20 @@ describe('Transactions', () => {
const transactionId = 'tx-wait';
// Create transaction
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add first delta
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('job1', 'status', 'processing')
.buildV1()
);
// Start waiting for transaction
const waitPromise = lossless.transactions.waitFor(transactionId);
const waitPromise = hyperview.transactions.waitFor(transactionId);
let isResolved = false;
waitPromise.then(() => { isResolved = true; });
@ -278,7 +278,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(false);
// Complete transaction
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('job1', 'status', 'completed')
.buildV1()
@ -289,7 +289,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(true);
// View should show completed transaction
const view = lossless.compose(['job1']);
const view = hyperview.compose(['job1']);
expect(view.job1).toBeDefined();
expect(view.job1.propertyDeltas.status).toHaveLength(2);
});
@ -302,14 +302,14 @@ describe('Transactions', () => {
.buildV1();
const updateEvents: string[] = [];
lossless.eventStream.on('updated', (entityId: string) => {
hyperview.eventStream.on('updated', (entityId: string) => {
updateEvents.push(entityId);
});
lossless.ingestDelta(regularDelta);
hyperview.ingestDelta(regularDelta);
// Should immediately appear in view
const view = lossless.compose(['user456']);
const view = hyperview.compose(['user456']);
expect(view.user456).toBeDefined();
expect(view.user456.propertyDeltas.name).toHaveLength(1);
@ -323,29 +323,29 @@ describe('Transactions', () => {
const transactionId = 'tx-resize';
// Initially declare transaction with size 2
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add 2 deltas
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('cart1', 'items', 'item1')
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('cart1', 'items', 'item2')
.buildV1()
);
// Transaction should be complete
expect(lossless.transactions.isComplete(transactionId)).toBe(true);
expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
// View should show the cart
const view = lossless.compose(['cart1']);
const view = hyperview.compose(['cart1']);
expect(view.cart1).toBeDefined();
});
@ -353,30 +353,30 @@ describe('Transactions', () => {
const transactionId = 'tx-no-size';
// Add delta with transaction reference but no size declaration
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('entity1', 'data', 'test')
.buildV1()
);
// Transaction should not be complete (no size)
expect(lossless.transactions.isComplete(transactionId)).toBe(false);
expect(hyperview.transactions.isComplete(transactionId)).toBe(false);
// Delta should not appear in view
const view = lossless.compose(['entity1']);
const view = hyperview.compose(['entity1']);
expect(view.entity1).toBeUndefined();
// Declare size after the fact
lossless.ingestDelta(createDelta('system', 'host1')
hyperview.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 1)
.buildV1()
);
// Now transaction should be complete
expect(lossless.transactions.isComplete(transactionId)).toBe(true);
expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
// And delta should appear in view
const viewAfter = lossless.compose(['entity1']);
const viewAfter = hyperview.compose(['entity1']);
expect(viewAfter.entity1).toBeDefined();
});
});

View File

@ -1,5 +1,5 @@
import { QueryEngine } from '@src/query';
import { Lossless } from '@src/views';
import { Hyperview } from '@src/views';
import { DefaultSchemaRegistry } from '@src/schema';
import { SchemaBuilder, PrimitiveSchemas } from '@src/schema';
import { CommonSchemas } from '../../../util/schemas';
@ -8,7 +8,7 @@ import { RhizomeNode } from '@src/node';
describe('Query Engine', () => {
let queryEngine: QueryEngine;
let lossless: Lossless;
let hyperview: Hyperview;
let schemaRegistry: DefaultSchemaRegistry;
let rhizomeNode: RhizomeNode;
@ -19,9 +19,9 @@ describe('Query Engine', () => {
requestBindPort: 4003
});
lossless = rhizomeNode.lossless;
hyperview = rhizomeNode.hyperview;
schemaRegistry = new DefaultSchemaRegistry();
queryEngine = new QueryEngine(lossless, schemaRegistry);
queryEngine = new QueryEngine(hyperview, schemaRegistry);
// Register test schemas
schemaRegistry.register(CommonSchemas.User());
@ -53,7 +53,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'name', name, 'user')
.buildV1();
lossless.ingestDelta(nameDelta);
hyperview.ingestDelta(nameDelta);
// Add age if provided
if (age !== undefined) {
@ -62,7 +62,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'age', age, 'user')
.buildV1();
lossless.ingestDelta(ageDelta);
hyperview.ingestDelta(ageDelta);
}
// Add email if provided
@ -72,7 +72,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'email', email, 'user')
.buildV1();
lossless.ingestDelta(emailDelta);
hyperview.ingestDelta(emailDelta);
}
}
@ -83,7 +83,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'title', title, 'post')
.buildV1();
lossless.ingestDelta(titleDelta);
hyperview.ingestDelta(titleDelta);
// Author delta
const authorDelta = createDelta('test', 'test-host')
@ -91,7 +91,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'author', author, 'post')
.buildV1();
lossless.ingestDelta(authorDelta);
hyperview.ingestDelta(authorDelta);
// Published delta
const publishedDelta = createDelta('test', 'test-host')
@ -99,7 +99,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'published', published, 'post')
.buildV1();
lossless.ingestDelta(publishedDelta);
hyperview.ingestDelta(publishedDelta);
// Views delta
const viewsDelta = createDelta('test', 'test-host')
@ -107,7 +107,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'views', views, 'post')
.buildV1();
lossless.ingestDelta(viewsDelta);
hyperview.ingestDelta(viewsDelta);
}
describe('Basic Query Operations', () => {

View File

@ -1,12 +1,12 @@
import {DeltaFilter} from '@src/core';
import {Lossless} from '@src/views';
import {Hyperview} from '@src/views';
import {RhizomeNode} from '@src/node';
import {createDelta} from '@src/core/delta-builder';
describe('Lossless', () => {
describe('Hyperview', () => {
const node = new RhizomeNode();
test('creates a lossless view of keanu as neo in the matrix', () => {
test('creates a hyperview of keanu as neo in the matrix', () => {
const delta = createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor')
@ -35,11 +35,11 @@ describe('Lossless', () => {
target: "usd"
}]);
const lossless = new Lossless(node);
const hyperview = new Hyperview(node);
lossless.ingestDelta(delta);
hyperview.ingestDelta(delta);
expect(lossless.compose()).toMatchObject({
expect(hyperview.compose()).toMatchObject({
keanu: {
referencedAs: ["actor"],
propertyDeltas: {
@ -100,11 +100,11 @@ describe('Lossless', () => {
.addPointer('salary_currency', 'usd')
.buildV2();
const lossless = new Lossless(node);
const hyperview = new Hyperview(node);
lossless.ingestDelta(delta);
hyperview.ingestDelta(delta);
expect(lossless.compose()).toMatchObject({
expect(hyperview.compose()).toMatchObject({
keanu: {
referencedAs: ["actor"],
propertyDeltas: {
@ -157,18 +157,18 @@ describe('Lossless', () => {
});
describe('can filter deltas', () => {
const lossless = new Lossless(node);
const hyperview = new Hyperview(node);
beforeAll(() => {
// First delta
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('A', 'H')
.setProperty('ace', 'value', '1', 'ace')
.buildV1()
);
// Second delta
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('B', 'H')
// 10 11j 12q 13k 14a
// .addPointer('14', 'ace', 'value')
@ -176,7 +176,7 @@ describe('Lossless', () => {
.buildV1()
);
expect(lossless.compose()).toMatchObject({
expect(hyperview.compose()).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@ -205,7 +205,7 @@ describe('Lossless', () => {
return creator === 'A' && host === 'H';
};
expect(lossless.compose(undefined, filter)).toMatchObject({
expect(hyperview.compose(undefined, filter)).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@ -221,7 +221,7 @@ describe('Lossless', () => {
}
});
expect(lossless.compose(["ace"], filter)).toMatchObject({
expect(hyperview.compose(["ace"], filter)).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@ -239,18 +239,18 @@ describe('Lossless', () => {
});
test('filter with transactions', () => {
const losslessT = new Lossless(node);
const hyperviewT = new Hyperview(node);
const transactionId = 'tx-filter-test';
// Declare transaction with 3 deltas
losslessT.ingestDelta(
hyperviewT.ingestDelta(
createDelta('system', 'H')
.declareTransaction(transactionId, 3)
.buildV1()
);
// A1: First delta from creator A
losslessT.ingestDelta(
hyperviewT.ingestDelta(
createDelta('A', 'H')
.inTransaction(transactionId)
.setProperty('process1', 'status', 'started', 'step')
@ -258,7 +258,7 @@ describe('Lossless', () => {
);
// B: Delta from creator B
losslessT.ingestDelta(
hyperviewT.ingestDelta(
createDelta('B', 'H')
.inTransaction(transactionId)
.setProperty('process1', 'status', 'processing', 'step')
@ -266,11 +266,11 @@ describe('Lossless', () => {
);
// Transaction incomplete - nothing should show
const incompleteView = losslessT.compose(['process1']);
const incompleteView = hyperviewT.compose(['process1']);
expect(incompleteView.process1).toBeUndefined();
// A2: Second delta from creator A completes transaction
losslessT.ingestDelta(
hyperviewT.ingestDelta(
createDelta('A', 'H')
.inTransaction(transactionId)
.addPointer('step', 'process1', 'status')
@ -279,13 +279,13 @@ describe('Lossless', () => {
);
// All deltas visible now
const completeView = losslessT.compose(['process1']);
const completeView = hyperviewT.compose(['process1']);
expect(completeView.process1).toBeDefined();
expect(completeView.process1.propertyDeltas.status).toHaveLength(3);
// Filter by creator A only
const filterA: DeltaFilter = ({creator}) => creator === 'A';
const filteredView = losslessT.compose(['process1'], filterA);
const filteredView = hyperviewT.compose(['process1'], filterA);
expect(filteredView.process1).toBeDefined();
expect(filteredView.process1.propertyDeltas.status).toHaveLength(2);

View File

@ -1,12 +1,12 @@
import Debug from 'debug';
import { PointerTarget } from "@src/core/delta";
import { Lossless, LosslessViewOne } from "@src/views/lossless";
import { Lossy } from "@src/views/lossy";
import { Hyperview, HyperviewViewOne } from "@src/views/hyperview";
import { Lossy } from "@src/views/view";
import { RhizomeNode } from "@src/node";
import { valueFromDelta } from "@src/views/lossless";
import { valueFromDelta } from "@src/views/hyperview";
import { latestFromCollapsedDeltas } from "@src/views/resolvers/timestamp-resolvers";
import { createDelta } from "@src/core/delta-builder";
const debug = Debug('rz:test:lossy');
const debug = Debug('rz:test:view');
type Role = {
actor: PointerTarget,
@ -21,9 +21,9 @@ type Summary = {
class Summarizer extends Lossy<Summary> {
private readonly debug: debug.Debugger;
constructor(lossless: Lossless) {
super(lossless);
this.debug = Debug('rz:test:lossy:summarizer');
constructor(hyperview: Hyperview) {
super(hyperview);
this.debug = Debug('rz:test:view:summarizer');
}
initializer(): Summary {
@ -37,9 +37,9 @@ class Summarizer extends Lossy<Summary> {
// it's really not CRDT, it likely depends on the order of the pointers.
// TODO: Prove with failing test
reducer(acc: Summary, cur: LosslessViewOne): Summary {
reducer(acc: Summary, cur: HyperviewViewOne): Summary {
this.debug(`Processing view for entity ${cur.id} (referenced as: ${cur.referencedAs?.join(', ')})`);
this.debug(`lossless view:`, JSON.stringify(cur));
this.debug(`hyperview:`, JSON.stringify(cur));
if (cur.referencedAs?.includes("role")) {
this.debug(`Found role entity: ${cur.id}`);
@ -92,12 +92,12 @@ class Summarizer extends Lossy<Summary> {
describe('Lossy', () => {
describe('use a provided initializer, reducer, and resolver to resolve entity views', () => {
const node = new RhizomeNode();
const lossless = new Lossless(node);
const hyperview = new Hyperview(node);
const lossy = new Summarizer(lossless);
const view = new Summarizer(hyperview);
beforeAll(() => {
lossless.ingestDelta(createDelta('a', 'h')
hyperview.ingestDelta(createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor')
.addPointer('film', 'the_matrix', 'cast')
@ -108,7 +108,7 @@ describe('Lossy', () => {
});
test('example summary', () => {
const result = lossy.resolve();
const result = view.resolve();
debug('result', result);
expect(result).toEqual({
roles: [{

View File

@ -85,10 +85,10 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('salary', 15000000)
.addPointer('contract_date', '1999-03-31')
.buildV1();
node.lossless.ingestDelta(castingDelta);
node.hyperview.ingestDelta(castingDelta);
// Test from Keanu's perspective
const keanuViews = node.lossless.compose(['keanu']);
const keanuViews = node.hyperview.compose(['keanu']);
const keanuView = keanuViews['keanu'];
expect(keanuView.propertyDeltas.filmography).toBeDefined();
@ -97,7 +97,7 @@ describe('Multi-Pointer Delta Resolution', () => {
const nestedKeanuView = schemaRegistry.applySchemaWithNesting(
keanuView,
'actor',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -117,13 +117,13 @@ describe('Multi-Pointer Delta Resolution', () => {
}
// Test from Matrix's perspective
const matrixViews = node.lossless.compose(['matrix']);
const matrixViews = node.hyperview.compose(['matrix']);
const matrixView = matrixViews['matrix'];
const nestedMatrixView = schemaRegistry.applySchemaWithNesting(
matrixView,
'movie',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -169,16 +169,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('since', '2020-01-15')
.addPointer('intensity', 8)
.buildV1();
node.lossless.ingestDelta(relationshipDelta);
node.hyperview.ingestDelta(relationshipDelta);
// Test from Alice's perspective
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedAliceView = schemaRegistry.applySchemaWithNesting(
aliceView,
'person',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -244,16 +244,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('budget', 50000)
.addPointer('deadline', '2024-06-01')
.buildV1();
node.lossless.ingestDelta(collaborationDelta);
node.hyperview.ingestDelta(collaborationDelta);
// Test from project's perspective
const projectViews = node.lossless.compose(['website']);
const projectViews = node.hyperview.compose(['website']);
const projectView = projectViews['website'];
const nestedProjectView = schemaRegistry.applySchemaWithNesting(
projectView,
'project',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);

View File

@ -59,10 +59,10 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.lossless.ingestDelta(friendshipDelta);
node.hyperview.ingestDelta(friendshipDelta);
// Get Alice's lossless view
const aliceViews = node.lossless.compose(['alice']);
// Get Alice's hyperview
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
expect(aliceView).toBeDefined();
@ -73,7 +73,7 @@ describe('Nested Object Resolution', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -107,15 +107,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'nonexistent')
.buildV1();
node.lossless.ingestDelta(friendshipDelta);
node.hyperview.ingestDelta(friendshipDelta);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -158,23 +158,23 @@ describe('Nested Object Resolution', () => {
.addPointer('deep-users', 'alice', 'mentor')
.addPointer('mentor', 'bob')
.buildV1();
node.lossless.ingestDelta(mentorshipDelta1);
node.hyperview.ingestDelta(mentorshipDelta1);
// Bob's mentor is Charlie
const mentorshipDelta2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('deep-users', 'bob', 'mentor')
.addPointer('mentor', 'charlie')
.buildV1();
node.lossless.ingestDelta(mentorshipDelta2);
node.hyperview.ingestDelta(mentorshipDelta2);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
// Test with maxDepth = 1 (should only resolve Alice and Bob)
const shallowView = schemaRegistry.applySchemaWithNesting(
aliceView,
'deep-user',
node.lossless,
node.hyperview,
{ maxDepth: 1 }
);
@ -197,7 +197,7 @@ describe('Nested Object Resolution', () => {
const deepView = schemaRegistry.applySchemaWithNesting(
aliceView,
'deep-user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -234,22 +234,22 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.lossless.ingestDelta(friendship1);
node.hyperview.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'bob', 'friends')
.addPointer('friends', 'alice')
.buildV1();
node.lossless.ingestDelta(friendship2);
node.hyperview.ingestDelta(friendship2);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
// Should handle circular reference without infinite recursion
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 3 }
);
@ -275,15 +275,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'alice')
.buildV1();
node.lossless.ingestDelta(selfFriendship);
node.hyperview.ingestDelta(selfFriendship);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -311,21 +311,21 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.lossless.ingestDelta(friendship1);
node.hyperview.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'charlie')
.buildV1();
node.lossless.ingestDelta(friendship2);
node.hyperview.ingestDelta(friendship2);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 2 }
);
@ -373,15 +373,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.lossless.ingestDelta(friendship);
node.hyperview.ingestDelta(friendship);
const aliceViews = node.lossless.compose(['alice']);
const aliceViews = node.hyperview.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.lossless,
node.hyperview,
{ maxDepth: 3 }
);


@ -1,6 +1,6 @@
import {
RhizomeNode,
Lossless,
Hyperview,
AggregationResolver,
MinResolver,
MaxResolver,
@ -13,31 +13,31 @@ import { createDelta } from "@src/core/delta-builder";
describe('Aggregation Resolvers', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('Basic Aggregation', () => {
test('should aggregate numbers using min resolver', () => {
const minResolver = new MinResolver(lossless, ['score']);
const minResolver = new MinResolver(hyperview, ['score']);
// Add first entity with score 10
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
// Add second entity with score 5
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection')
.buildV1()
);
// Add third entity with score 15
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection')
.buildV1()
);
@ -52,20 +52,20 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using max resolver', () => {
const maxResolver = new MaxResolver(lossless, ['score']);
const maxResolver = new MaxResolver(hyperview, ['score']);
// Add deltas for entities
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection')
.buildV1()
);
@ -79,22 +79,22 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using sum resolver', () => {
const sumResolver = new SumResolver(lossless, ['value']);
const sumResolver = new SumResolver(hyperview, ['value']);
// Add first value for entity1
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);
// Add second value for entity1 (should sum)
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 20, 'collection')
.buildV1()
);
// Add value for entity2
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'value', 5, 'collection')
.buildV1()
);
@ -107,21 +107,21 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using average resolver', () => {
const avgResolver = new AverageResolver(lossless, ['score']);
const avgResolver = new AverageResolver(hyperview, ['score']);
// Add multiple scores for entity1
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection')
.buildV1()
);
// Single value for entity2
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 30, 'collection')
.buildV1()
);
@ -134,21 +134,21 @@ describe('Aggregation Resolvers', () => {
});
test('should count values using count resolver', () => {
const countResolver = new CountResolver(lossless, ['visits']);
const countResolver = new CountResolver(hyperview, ['visits']);
// Add multiple visit deltas for entity1
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection')
.buildV1()
);
// Single visit for entity2
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'visits', 1, 'collection')
.buildV1()
);
@ -163,40 +163,40 @@ describe('Aggregation Resolvers', () => {
describe('Custom Aggregation Configuration', () => {
test('should handle mixed aggregation types', () => {
const resolver = new AggregationResolver(lossless, {
const resolver = new AggregationResolver(hyperview, {
min_val: 'min' as AggregationType,
max_val: 'max' as AggregationType,
sum_val: 'sum' as AggregationType
});
// Add first set of values
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 10, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 5, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 3, 'collection')
.buildV1()
);
// Add second set of values
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 5, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 15, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 7, 'collection')
.buildV1()
);
@ -212,25 +212,25 @@ describe('Aggregation Resolvers', () => {
});
test('should ignore non-numeric values', () => {
const resolver = new AggregationResolver(lossless, {
const resolver = new AggregationResolver(hyperview, {
score: 'sum' as AggregationType,
name: 'count' as AggregationType
});
// Add numeric value
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
// Add non-numeric value (string)
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection')
.buildV1()
);
// Add another numeric value
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection')
.buildV1()
);
@ -244,9 +244,9 @@ describe('Aggregation Resolvers', () => {
});
test('should handle empty value arrays', () => {
const sumResolver = new SumResolver(lossless, ['score']);
const sumResolver = new SumResolver(hyperview, ['score']);
// Create entity with non-aggregated property
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection')
.buildV1()
);
@ -261,9 +261,9 @@ describe('Aggregation Resolvers', () => {
describe('Edge Cases', () => {
test('should handle single value aggregations', () => {
const avgResolver = new AverageResolver(lossless, ['value']);
const avgResolver = new AverageResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 42, 'collection')
.buildV1()
);
@ -275,14 +275,14 @@ describe('Aggregation Resolvers', () => {
});
test('should handle zero values', () => {
const sumResolver = new SumResolver(lossless, ['value']);
const sumResolver = new SumResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 0, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);
@ -294,14 +294,14 @@ describe('Aggregation Resolvers', () => {
});
test('should handle negative values', () => {
const minResolver = new MinResolver(lossless, ['value']);
const minResolver = new MinResolver(hyperview, ['value']);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', -5, 'collection')
.buildV1()
);
lossless.ingestDelta(createDelta('test', 'host1')
hyperview.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);


@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta } from '@src';
import { CollapsedDelta } from '@src/views/lossless';
import { RhizomeNode, Hyperview, createDelta } from '@src';
import { CollapsedDelta } from '@src/views/hyperview';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null;
describe('Basic Dependency Resolution', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
test('should resolve dependencies in correct order', () => {
@ -51,20 +51,20 @@ describe('Basic Dependency Resolution', () => {
}
}
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
first: new FirstPlugin(),
second: new SecondPlugin()
});
// Add some data
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test1', 'first', 'hello', 'test')
.buildV1()
);
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('test1', 'second', 'world', 'test')


@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless } from '@src';
import { CollapsedDelta } from '@src/views/lossless';
import { RhizomeNode, Hyperview } from '@src';
import { CollapsedDelta } from '@src/views/hyperview';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null;
describe('Circular Dependency Detection', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
test('should detect circular dependencies', () => {
@ -53,7 +53,7 @@ describe('Circular Dependency Detection', () => {
// Should throw an error when circular dependencies are detected
expect(() => {
new CustomResolver(lossless, {
new CustomResolver(hyperview, {
'a': new PluginA(),
'b': new PluginB()
});
@ -84,7 +84,7 @@ describe('Circular Dependency Detection', () => {
// Should detect the circular dependency: a -> c -> b -> a
expect(() => {
new CustomResolver(lossless, {
new CustomResolver(hyperview, {
'a': new PluginA(),
'b': new PluginB(),
'c': new PluginC()


@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src';
import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import {
CustomResolver,
DependencyStates,
@ -9,11 +9,11 @@ import { PropertyTypes } from '@src/core/types';
describe('Edge Cases', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
test('should handle null values', () => {
@ -45,11 +45,11 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
value: new NullSafeLastWriteWinsPlugin()
});
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'value', null, 'test')
@ -95,19 +95,19 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
value: new ConcurrentUpdatePlugin()
});
// Two updates with the same timestamp
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'value', null, 'test')
.buildV1()
);
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user2', 'host2')
.withTimestamp(1000) // Same timestamp
.setProperty('test2', 'value', 'xylophone', 'test')
@ -149,13 +149,13 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
counter: new CounterPlugin()
});
// Add 1000 updates
for (let i = 0; i < 1000; i++) {
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000 + i)
.setProperty('test3', 'counter', i, 'test')
@ -193,7 +193,7 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
missing: new MissingPropertyPlugin()
});


@ -1,6 +1,6 @@
import { PropertyID } from '@src/core/types';
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta } from '@src';
import { RhizomeNode, Hyperview, createDelta } from '@src';
import {
CustomResolver,
LastWriteWinsPlugin,
@ -10,7 +10,7 @@ import {
ResolverPlugin
} from '@src/views/resolvers/custom-resolvers';
import Debug from 'debug';
const debug = Debug('rz:test:lossless');
const debug = Debug('rz:test:hyperview');
// A simple plugin that depends on other plugins
class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initialized: boolean }> {
@ -49,15 +49,15 @@ class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initial
describe('Multiple Plugins Integration', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
test('should handle multiple plugins with dependencies', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
temperature: new LastWriteWinsPlugin(),
maxTemp: new MaxPlugin('temperature'),
minTemp: new MinPlugin('temperature'),
@ -67,7 +67,7 @@ describe('Multiple Plugins Integration', () => {
// Add some temperature readings
const readings = [22, 25, 18, 30, 20];
readings.forEach((temp, index) => {
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('sensor1', 'host1')
.withTimestamp(1000 + index * 1000)
.setProperty('room1', 'temperature', temp, 'sensors')
@ -89,7 +89,7 @@ describe('Multiple Plugins Integration', () => {
});
test('should handle multiple entities with different plugins', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
name: new LastWriteWinsPlugin(),
tags: new ConcatenationPlugin(),
score: new MaxPlugin()
@ -97,7 +97,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting first delta`);
// Add data for entity1
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'name', 'Test Entity', 'test-name')
@ -107,7 +107,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting second delta`);
// Add more tags to entity1
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('entity1', 'tags', 'tag2', 'test')
@ -116,7 +116,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting third delta`);
// Add data for entity2
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity2', 'score', 85, 'test')
@ -125,7 +125,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting fourth delta`);
// Update score for entity2
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('entity2', 'score', 90, 'test')


@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src';
import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import {
CustomResolver,
ResolverPlugin,
@ -51,20 +51,20 @@ type LifecycleTestState = {
describe('Plugin Lifecycle', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
test('should call initialize, update, and resolve in order', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin()
});
// Add some data
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test1', 'test', 'value1')
@ -92,12 +92,12 @@ describe('Plugin Lifecycle', () => {
});
test('should handle multiple updates correctly', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin()
});
// First update
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'test', 'value1')
@ -105,7 +105,7 @@ describe('Plugin Lifecycle', () => {
);
// Second update
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('test2', 'test', 'value2')
@ -132,7 +132,7 @@ describe('Plugin Lifecycle', () => {
});
test('should handle empty state', () => {
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
test: new LifecycleTestPlugin()
});


@ -1,7 +1,7 @@
import { describe, test, expect } from '@jest/globals';
import { ResolverPlugin, DependencyStates } from '@src/views/resolvers/custom-resolvers';
import { PropertyTypes } from '@src/core/types';
import type { CollapsedDelta } from '@src/views/lossless';
import type { CollapsedDelta } from '@src/views/hyperview';
import { testResolverWithPlugins, createTestDelta } from '@test-helpers/resolver-test-helper';
class CountPlugin extends ResolverPlugin<{ count: number }> {


@ -1,6 +1,6 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode } from '@src';
import { Lossless } from '@src/views/lossless';
import { Hyperview } from '@src/views/hyperview';
import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
import { ResolverPlugin } from '@src/views/resolvers/custom-resolvers/plugin';
// import Debug from 'debug';
@ -25,11 +25,11 @@ class TestPlugin extends ResolverPlugin<unknown> {
describe('CustomResolver', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('buildDependencyGraph', () => {
@ -42,7 +42,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(lossless, plugins);
const resolver = new CustomResolver(hyperview, plugins);
const graph = resolver.dependencyGraph;
@ -65,7 +65,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(lossless, plugins);
const resolver = new CustomResolver(hyperview, plugins);
// Access private method for testing
const graph = resolver.dependencyGraph;
@ -86,7 +86,7 @@ describe('CustomResolver', () => {
// Act & Assert
expect(() => {
new CustomResolver(lossless, plugins);
new CustomResolver(hyperview, plugins);
}).toThrow('Dependency nonexistent not found for plugin a');
});
@ -99,7 +99,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(lossless, plugins);
const resolver = new CustomResolver(hyperview, plugins);
// Access private method for testing
const graph = resolver.dependencyGraph;
@ -125,7 +125,7 @@ describe('CustomResolver', () => {
// Act & Assert
expect(() => {
new CustomResolver(lossless, plugins);
new CustomResolver(hyperview, plugins);
}).toThrow('Circular dependency detected in plugin dependencies');
});
});


@ -1,6 +1,6 @@
import Debug from "debug";
import { createDelta } from '@src/core/delta-builder';
import { Lossless, RhizomeNode } from '@src';
import { Hyperview, RhizomeNode } from '@src';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
const debug = Debug('rz:test:last-write-wins');
@ -11,24 +11,24 @@ describe('Last write wins', () => {
describe('given that two separate writes occur', () => {
const node = new RhizomeNode();
const lossless = new Lossless(node);
const hyperview = new Hyperview(node);
const lossy = new TimestampResolver(lossless);
const view = new TimestampResolver(hyperview);
beforeAll(() => {
lossless.ingestDelta(createDelta('a', 'h')
hyperview.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 95, 'vegetable')
.buildV1()
);
lossless.ingestDelta(createDelta('a', 'h')
hyperview.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 90, 'vegetable')
.buildV1()
);
});
test('our resolver should return the most recently written value', () => {
const result = lossy.resolve(["broccoli"]);
const result = view.resolve(["broccoli"]);
debug('result', result);
expect(result).toMatchObject({
broccoli: {


@ -1,5 +1,5 @@
import { RhizomeNode, Lossless, createDelta } from "@src";
import { CollapsedDelta } from "@src/views/lossless";
import { RhizomeNode, Hyperview, createDelta } from "@src";
import { CollapsedDelta } from "@src/views/hyperview";
import {
CustomResolver,
ResolverPlugin,
@ -10,11 +10,11 @@ import { PropertyTypes } from '@src/core/types';
describe('State Visibility', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
// A test plugin that records which states it sees
@ -88,10 +88,10 @@ describe('State Visibility', () => {
prop2: spy2
} as const;
const resolver = new CustomResolver(lossless, config);
const resolver = new CustomResolver(hyperview, config);
// Add some data
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'prop1', 'value1', 'entity-prop1')
@ -127,10 +127,10 @@ describe('State Visibility', () => {
dependsOn: dependency
} as const;
const resolver = new CustomResolver(lossless, config);
const resolver = new CustomResolver(hyperview, config);
// Add some data
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@ -157,14 +157,14 @@ describe('State Visibility', () => {
const lastWrite = new LastWriteWinsPlugin();
const other = new LastWriteWinsPlugin();
const resolver = new CustomResolver(lossless, {
const resolver = new CustomResolver(hyperview, {
dependent: dependent,
dependsOn: lastWrite,
other: other // Not declared as a dependency
});
// Add some data
lossless.ingestDelta(
hyperview.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@ -214,7 +214,7 @@ describe('State Visibility', () => {
}
expect(() => {
new CustomResolver(lossless, {
new CustomResolver(hyperview, {
bad: new PluginWithBadDeps()
});
}).toThrow("Dependency nonexistent not found for plugin bad");


@ -1,6 +1,6 @@
import {
RhizomeNode,
Lossless,
Hyperview,
TimestampResolver,
CreatorIdTimestampResolver,
DeltaIdTimestampResolver,
@ -13,19 +13,19 @@ const debug = Debug('rz:test:timestamp-resolvers');
describe('Timestamp Resolvers', () => {
let node: RhizomeNode;
let lossless: Lossless;
let hyperview: Hyperview;
beforeEach(() => {
node = new RhizomeNode();
lossless = new Lossless(node);
hyperview = new Hyperview(node);
});
describe('Basic Timestamp Resolution', () => {
test('should resolve by most recent timestamp', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
// Add older delta
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@ -34,7 +34,7 @@ describe('Timestamp Resolvers', () => {
);
// Add newer delta
lossless.ingestDelta(createDelta('user2', 'host2')
hyperview.ingestDelta(createDelta('user2', 'host2')
.withId('delta2')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'score')
@ -50,10 +50,10 @@ describe('Timestamp Resolvers', () => {
});
test('should handle multiple entities with different timestamps', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
// Entity1 - older value
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 100)
@ -61,7 +61,7 @@ describe('Timestamp Resolvers', () => {
);
// Entity2 - newer value
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity2', 'value')
.addPointer('value', 200)
@ -78,10 +78,10 @@ describe('Timestamp Resolvers', () => {
describe('Tie-Breaking Strategies', () => {
test('should break ties using creator-id strategy', () => {
const resolver = new CreatorIdTimestampResolver(lossless);
const resolver = new CreatorIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different creators
lossless.ingestDelta(createDelta('user_z', 'host1')
hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@ -89,7 +89,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user_a', 'host1')
hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@ -105,10 +105,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using delta-id strategy', () => {
const resolver = new DeltaIdTimestampResolver(lossless);
const resolver = new DeltaIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different delta IDs
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@ -116,7 +116,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@ -132,10 +132,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using host-id strategy', () => {
const resolver = new HostIdTimestampResolver(lossless);
const resolver = new HostIdTimestampResolver(hyperview);
// Two deltas with same timestamp, different hosts
lossless.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later
hyperview.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@ -143,7 +143,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier
hyperview.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@ -159,10 +159,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using lexicographic strategy with string values', () => {
const resolver = new LexicographicTimestampResolver(lossless);
const resolver = new LexicographicTimestampResolver(hyperview);
// Two deltas with same timestamp, different string values
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@ -170,7 +170,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name')
@ -186,10 +186,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using lexicographic strategy with numeric values (falls back to delta ID)', () => {
const resolver = new LexicographicTimestampResolver(lossless);
const resolver = new LexicographicTimestampResolver(hyperview);
// Two deltas with same timestamp, numeric values (should fall back to delta ID comparison)
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@ -197,7 +197,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@ -215,11 +215,11 @@ describe('Timestamp Resolvers', () => {
describe('Complex Tie-Breaking Scenarios', () => {
test('should handle multiple properties with different tie-breaking outcomes', () => {
const creatorResolver = new CreatorIdTimestampResolver(lossless);
const deltaResolver = new DeltaIdTimestampResolver(lossless);
const creatorResolver = new CreatorIdTimestampResolver(hyperview);
const deltaResolver = new DeltaIdTimestampResolver(hyperview);
// Add deltas for multiple properties with same timestamp
lossless.ingestDelta(createDelta('user_a', 'host1')
hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_z')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@ -227,7 +227,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user_z', 'host1')
hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_a')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name')
@ -249,10 +249,10 @@ describe('Timestamp Resolvers', () => {
});
test('should work consistently with timestamp priority over tie-breaking', () => {
const resolver = new CreatorIdTimestampResolver(lossless);
const resolver = new CreatorIdTimestampResolver(hyperview);
// Add older delta with "better" tie-breaking attributes
lossless.ingestDelta(createDelta('user_z', 'host1')
hyperview.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_z') // Would win in delta ID tie-breaking
.withTimestamp(1000) // Older timestamp
.addPointer('collection', 'entity1', 'score')
@ -261,7 +261,7 @@ describe('Timestamp Resolvers', () => {
);
// Add newer delta with "worse" tie-breaking attributes
lossless.ingestDelta(createDelta('user_a', 'host1')
hyperview.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_a') // Would lose in delta ID tie-breaking
.withTimestamp(2000) // Newer timestamp
.addPointer('collection', 'entity1', 'score')
@ -279,8 +279,8 @@ describe('Timestamp Resolvers', () => {
describe('Edge Cases', () => {
test('should handle single delta correctly', () => {
const resolver = new TimestampResolver(lossless, 'creator-id');
lossless.ingestDelta(createDelta('user1', 'host1')
const resolver = new TimestampResolver(hyperview, 'creator-id');
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
@ -295,9 +295,9 @@ describe('Timestamp Resolvers', () => {
});
test('should handle mixed value types correctly', () => {
const resolver = new TimestampResolver(lossless);
const resolver = new TimestampResolver(hyperview);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@ -305,7 +305,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
lossless.ingestDelta(createDelta('user1', 'host1')
hyperview.ingestDelta(createDelta('user1', 'host1')
.withId('delta2')
.withTimestamp(1001)
.addPointer('collection', 'entity1', 'score')

View File

@ -11,7 +11,7 @@ classDiagram
-requestReply: RequestReply
-httpServer: HttpServer
-deltaStream: DeltaStream
-lossless: Lossless
-hyperview: Hyperview
-peers: Peers
-queryEngine: QueryEngine
-storageQueryEngine: StorageQueryEngine
@ -27,15 +27,15 @@ classDiagram
+pointers: PointerV1[]
}
class Lossless {
-domainEntities: Map<DomainEntityID, LosslessEntity>
class Hyperview {
-domainEntities: Map<DomainEntityID, HyperviewEntity>
-transactions: Transactions
+view(ids: DomainEntityID[]): LosslessViewMany
+compose(ids: DomainEntityID[]): LosslessViewMany
+view(ids: DomainEntityID[]): HyperviewViewMany
+compose(ids: DomainEntityID[]): HyperviewViewMany
}
class QueryEngine {
-lossless: Lossless
-hyperview: Hyperview
-schemaRegistry: SchemaRegistry
+query(schemaId: SchemaID, filter?: JsonLogic): Promise<SchemaAppliedViewWithNesting[]>
}
@ -69,23 +69,23 @@ classDiagram
%% Relationships
RhizomeNode --> DeltaStream
RhizomeNode --> Lossless
RhizomeNode --> Hyperview
RhizomeNode --> QueryEngine
RhizomeNode --> StorageQueryEngine
RhizomeNode --> SchemaRegistry
RhizomeNode --> DeltaStorage
Lossless --> Transactions
Lossless --> LosslessEntity
Hyperview --> Transactions
Hyperview --> HyperviewEntity
QueryEngine --> SchemaRegistry
QueryEngine --> Lossless
QueryEngine --> Hyperview
StorageQueryEngine --> DeltaStorage
StorageQueryEngine --> SchemaRegistry
DeltaStream --> Delta
Lossless --> Delta
Hyperview --> Delta
DockerOrchestrator --> ContainerManager
DockerOrchestrator --> NetworkManager
@ -103,7 +103,7 @@ classDiagram
- Represents atomic changes in the system
- Contains pointers to entities and their properties
3. **Lossless**: Manages the lossless view of data
3. **Hyperview**: Manages the hyperview view of data
- Maintains the complete history of deltas
- Provides methods to view and compose entity states
@ -126,7 +126,7 @@ classDiagram
## Data Flow
1. Deltas are received through the DeltaStream
2. Lossless processes and stores these deltas
2. Hyperview processes and stores these deltas
3. Queries can be made through either QueryEngine (in-memory) or StorageQueryEngine (persisted)
4. The system maintains consistency through the schema system
5. In distributed mode, DockerOrchestrator manages multiple node instances

View File

@ -10,11 +10,11 @@ The `CustomResolver` class is the main entry point for the Custom Resolver syste
class CustomResolver {
/**
* Creates a new CustomResolver instance
* @param view The lossless view to resolve
* @param view The hyperview to resolve
* @param config Plugin configuration
*/
constructor(
private readonly view: LosslessView,
private readonly view: HyperviewView,
private readonly config: ResolverConfig
);
@ -48,7 +48,7 @@ class CustomResolver {
Creates a new instance of the CustomResolver.
**Parameters:**
- `view: LosslessView` - The lossless view containing the data to resolve
- `view: HyperviewView` - The hyperview containing the data to resolve
- `config: ResolverConfig` - Configuration object mapping property IDs to their resolver plugins
**Example:**
@ -146,10 +146,10 @@ The resolver may throw the following errors:
```typescript
import { CustomResolver, LastWriteWinsPlugin } from './resolver';
import { LosslessView } from '../lossless-view';
import { HyperviewView } from '../hyperview-view';
// Create a lossless view with some data
const view = new LosslessView();
// Create a hyperview with some data
const view = new HyperviewView();
// ... add data to the view ...
// Configure the resolver

View File

@ -22,22 +22,26 @@ The `CustomResolver` system provides a flexible framework for resolving property
## Getting Started
```typescript
import { CustomResolver, LastWriteWinsPlugin } from './resolver';
import { LosslessView } from '../lossless-view';
import { CustomResolver } from '../src/views/resolvers/custom-resolvers';
import { LastWriteWinsPlugin } from '../src/views/resolvers/custom-resolvers/plugins';
import { Hyperview } from '../src/views/hyperview';
// Create a lossless view
const view = new LosslessView();
// Create a hyperview
const hyperview = new Hyperview();
// Create a resolver with a last-write-wins strategy
const resolver = new CustomResolver(view, {
const resolver = new CustomResolver(hyperview, {
myProperty: new LastWriteWinsPlugin()
});
// Process updates
// ...
// Process updates through the hyperview
// hyperview.ingestDelta(delta);
// Get resolved values
const result = resolver.resolve();
// Get resolved values for specific entities
const result = resolver.resolve(['entity1', 'entity2']);
// Or get all resolved values
const allResults = resolver.resolveAll();
```
## Next Steps

View File

@ -78,11 +78,11 @@ Create tests to verify your plugin's behavior:
```typescript
describe('DiscountedPricePlugin', () => {
let view: LosslessView;
let view: HyperviewView;
let resolver: CustomResolver;
beforeEach(() => {
view = new LosslessView();
view = new HyperviewView();
resolver = new CustomResolver(view, {
basePrice: new LastWriteWinsPlugin(),
discount: new LastWriteWinsPlugin(),

View File

@ -1,15 +1,150 @@
# Resolvers (Views)
# Views and Resolvers
The workhorse of this system is likely going to be our lossy views.
This is where the computation likely generally occurs.
## Core Concepts
So, let's talk about how to create a view.
### Views
A lossy view initializes from a given lossless view.
The lossless view dispatches events when entity properties are updated.
A `View` (previously known as `Lossy`) is a derived, computed representation of your data that provides efficient access to resolved values. It's built on top of the `Hyperview` (previously `Lossless`) storage layer, which maintains the complete history of all deltas.
View semantics are similar to map-reduce, reducers in Redux, etc.
```typescript
// Basic View implementation
export abstract class View<Accumulator, Result = Accumulator> {
// Core methods
abstract reducer(acc: Accumulator, cur: HyperviewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
// Built-in functionality
resolve(entityIds?: DomainEntityID[]): Result | undefined;
}
```
The key is to identify your accumulator object.
Your algorithm SHOULD be implemented so that the reducer is a pure function.
All state must therefore be stored in the accumulator.
### Hyperview
`Hyperview` (previously `Lossless`) maintains the complete, immutable history of all deltas and provides the foundation for building derived views.
```typescript
// Basic Hyperview usage
const hyperview = new Hyperview(rhizomeNode);
const view = new MyCustomView(hyperview);
const result = view.resolve(['entity1', 'entity2']);
```
## Creating Custom Resolvers
### Basic Resolver Pattern
1. **Define your accumulator type**: This holds the state of your view
2. **Implement the reducer**: Pure function that updates the accumulator
3. **Optionally implement resolver**: Transforms the accumulator into the final result
```typescript
class CountResolver extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + Object.keys(view.propertyDeltas).length;
}
initializer() { return 0; }
}
```
### Custom Resolver with Dependencies
Resolvers can depend on other resolvers using the `CustomResolver` system:
```typescript
// Define resolver plugins
const resolver = new CustomResolver(hyperview, {
totalItems: new CountPlugin(),
average: new AveragePlugin()
});
// Get resolved values
const result = resolver.resolve(['entity1', 'entity2']);
```
## Built-in Resolvers
### Last Write Wins
Keeps the most recent value based on timestamp:
```typescript
const resolver = new CustomResolver(hyperview, {
status: new LastWriteWinsPlugin()
});
```
### First Write Wins
Keeps the first value that was written:
```typescript
const resolver = new CustomResolver(hyperview, {
initialStatus: new FirstWriteWinsPlugin()
});
```
### Aggregation Resolvers
Built-in aggregators for common operations:
- `SumPlugin`: Sum of numeric values
- `AveragePlugin`: Average of numeric values
- `CountPlugin`: Count of values
- `MinPlugin`: Minimum value
- `MaxPlugin`: Maximum value
## Advanced Topics
### Performance Considerations
1. **Efficient Reducers**: Keep reducers pure and fast
2. **Selective Updates**: Only process changed entities
3. **Memoization**: Cache expensive computations
### Common Patterns
1. **Derived Properties**: Calculate values based on other properties
2. **State Machines**: Track state transitions over time
3. **Validation**: Enforce data consistency rules
## Best Practices
1. **Keep Reducers Pure**: Avoid side effects in reducers
2. **Use Strong Typing**: Leverage TypeScript for type safety
3. **Handle Edge Cases**: Consider empty states and error conditions
4. **Profile Performance**: Monitor and optimize hot paths
## Examples
### Simple Counter
```typescript
class CounterView extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + (view.propertyDeltas['increment']?.length ?? 0);
}
initializer() { return 0; }
}
```
### Running Average
```typescript
class RunningAverageView extends View<{sum: number, count: number}, number> {
reducer(acc: {sum: number, count: number}, view: HyperviewOne): {sum: number, count: number} {
const value = 0; // TODO: extract the numeric value from the view
return {
sum: acc.sum + value,
count: acc.count + 1
};
}
resolver(acc: {sum: number, count: number}): number {
return acc.count > 0 ? acc.sum / acc.count : 0;
}
initializer() { return {sum: 0, count: 0}; }
}
```

View File

@ -56,7 +56,7 @@ const unsafeDelta = createDelta('peer1', 'peer1')
.buildV1();
// This will be ingested without validation
node.lossless.ingestDelta(unsafeDelta);
node.hyperview.ingestDelta(unsafeDelta);
// 5. Check validation status after the fact
const stats = collection.getValidationStats();

View File

@ -84,9 +84,9 @@ const delta = createTestDelta('user1', 'host1')
## How It Works
1. Creates a new `Lossless` instance for the test
1. Creates a new `Hyperview` instance for the test
2. Sets up a `CustomResolver` with the provided plugins
3. Ingests all provided deltas into the `Lossless` instance
3. Ingests all provided deltas into the `Hyperview` instance
4. Retrieves a view for the specified entity
5. Processes the view through the resolver
6. Calls the `expectedResult` callback with the resolved entity
@ -97,4 +97,4 @@ const delta = createTestDelta('user1', 'host1')
- The helper handles all setup and teardown of test resources
- Use `createTestDelta` for consistent delta creation in tests
- The helper ensures type safety between the resolver and the expected result type
- Each test gets a fresh `Lossless` instance automatically
- Each test gets a fresh `Hyperview` instance automatically

View File

@ -18,8 +18,8 @@ type User = {
(async () => {
const rhizomeNode = new RhizomeNode();
// Enable API to read lossless view
rhizomeNode.httpServer.httpApi.serveLossless();
// Enable API to read the hyperview
rhizomeNode.httpServer.httpApi.serveHyperview();
const users = new BasicCollection("user");
users.rhizomeConnect(rhizomeNode);

View File

@ -136,10 +136,10 @@ smalltalk type messaging structure on top of the database
note dimensions of attack surface
layers:
primitives - what's a delta, schema, materialized view, lossy view
primitives - what's a delta, schema, materialized view, view
delta store - allows you to persist deltas and query over the delta stream
materialized view store - lossy snapshot(s)
lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings
materialized view store - view snapshot(s)
view bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
-- idea: diff tools

View File

@ -34,10 +34,10 @@ Pre-built technologies? LevelDB
Can use LevelDB to store deltas
Structure can correspond to our desired ontology
Layers for
- primitives - what's a delta, schema, materialized view, lossy view
- primitives - what's a delta, schema, materialized view, view
- delta store - allows you to persist deltas and query over the delta stream
- materialized view store - lossy snapshot(s)
- lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings
- materialized view store - view snapshot(s)
- view bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
Protocol involving ZeroMQ--

View File

@ -1,11 +1,11 @@
> myk:
> I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (lossy representation) <-> Lossless Representation <-> Delta[] I think
> I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (view representation) <-> Hyperview Representation <-> Delta[] I think
> the tricky stuff comes in with, like, how do you take an undifferentiated stream of deltas, a query and a schema
> and filter / merge those deltas into the lossless tree structure you need in order to then reduce into a lossy domain node
> and filter / merge those deltas into the hyperview tree structure you need in order to then reduce into a view domain node
> if that part of the flow works then the rest becomes fairly self-evident
> a "lossless representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc
> a "hyperview representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc
> so you need both the ID of the root node (the thing being targeted by one or more deltas) as well as the schema to apply to determine which contexts on that target to include (target_context effectively becomes a property on the domain entity, right?), as well as which schema to apply to included referenced entities, etc.
> so it's what keeps you from returning the whole stream of deltas, while still allowing you to follow arbitrary edges
@ -31,7 +31,7 @@ Example delta:
target: "usd"
}]
Lossless transformation:
Hyperview transformation:
{
keanu: {

View File

@ -9,7 +9,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
1. **Deltas are relationships**: Every delta with pointers already expresses relationships
2. **Patterns, not structure**: We're recognizing patterns in how deltas connect entities
3. **Perspective-driven**: Different views/resolvers can interpret the same deltas differently
4. **No single truth**: Competing deltas are resolved by application-level lossy resolvers
4. **No single truth**: Competing deltas are resolved by application-level view resolvers
5. **Time-aware**: All queries are inherently temporal, showing different relationships at different times
## Current State ✅
@ -17,7 +17,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
- **All tests passing**: 21/21 suites, 183/183 tests (100%)
- **Delta system**: Fully functional with pointers expressing relationships
- **Negation system**: Can invalidate deltas (and thus relationships)
- **Query system**: Basic traversal of lossless views
- **Query system**: Basic traversal of hyperviews
- **Schema system**: Can describe entity structures
- **Resolver system**: Application-level interpretation of deltas
@ -120,7 +120,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
class PatternResolver {
// Interpret deltas matching certain patterns
resolveWithPatterns(entityId, patterns) {
const deltas = this.lossless.getDeltasForEntity(entityId);
const deltas = this.hyperview.getDeltasForEntity(entityId);
return {
entity: entityId,

View File

@ -2,7 +2,7 @@
## Executive Summary
The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → lossless → lossy transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented.
The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → hyperview → view transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented.
## Core Concept Alignment
@ -13,13 +13,13 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
- **Implementation**: Correctly implements both V1 (array) and V2 (object) formats
- **Tests**: Basic format conversion tested, but validation gaps exist
2. **Lossless Views**
2. **Hyperviews**
- **Spec**: Full inventory of all deltas composing an object
- **Implementation**: `LosslessViewDomain` correctly accumulates deltas by entity/property
- **Implementation**: `HyperviewViewDomain` correctly accumulates deltas by entity/property
- **Tests**: Good coverage of basic transformation, filtering by creator/host
3. **Lossy Views**
- **Spec**: Compression of lossless views using resolution strategies
- **Spec**: Compression of hyperviews using resolution strategies
- **Implementation**: Initializer/reducer/resolver pattern provides flexibility
- **Tests**: Domain-specific example (Role/Actor/Film) demonstrates concept
@ -127,4 +127,4 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
## Conclusion
The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/lossless/lossy pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios.
The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/hyperview/view pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios.

14
spec.md
View File

@ -8,11 +8,11 @@
* a `primitive` is a literal string, number or boolean value whose meaning is not tied up in its being a reference to a larger whole.
* An `object` is a composite object whose entire existence is encoded as the set of deltas that reference it. An object is identified by a unique `reference`, and every delta that includes that `reference` is asserting a claim about some property of that object.
* A `negation` is a specific kind of delta that includes a pointer with the name `negates`, a `target` reference to another delta, and a `context` called `negated_by`.
* a `schema` represents a template by which an `object` can be compiled into a `lossless view`. A schema specifies which properties of that object are included, and it specifies schemas for the objects references by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress.
* For instance, a `lossless view` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc.
* a `schema` represents a template by which an `object` can be compiled into a `hyperview`. A schema specifies which properties of that object are included, and it specifies schemas for the objects referenced by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress.
* For instance, a `hyperview` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc.
* This could lead to circular references and arbitrarily deep nesting, which runs into the problem of "returning the entire graph". So our schema should specify, for instance, that the "friends" field apply the "Summary" schema to referenced users rather than the "User" schema, where the "Summary" schema simply resolves to username and photo.
* A `lossless view` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a lossless view of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance.
* A `lossless view` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a lossless view, these references would be expanded to contain lossless views of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress.
* A `lossy view` is a compression of a `lossless view` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`.
* Note that in a lossless view any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context.
* In collapsing a lossless view into a lossy view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the lossy view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average".
* A `hyperview` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a hyperview of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance.
* A `hyperview` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a hyperview, these references would be expanded to contain hyperviews of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress.
* A `view` is a compression of a `hyperview` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`.
* Note that in a hyperview any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context.
* In collapsing a hyperview into a view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average".

View File

@ -18,7 +18,7 @@ export abstract class Collection<View> {
rhizomeNode?: RhizomeNode;
name: string;
eventStream = new EventEmitter();
lossy?: View;
view?: View;
constructor(name: string) {
this.name = name;
@ -34,7 +34,7 @@ export abstract class Collection<View> {
this.initializeView();
// Listen for completed transactions, and emit updates to event stream
this.rhizomeNode.lossless.eventStream.on("updated", (id) => {
this.rhizomeNode.hyperview.eventStream.on("updated", (id) => {
// TODO: Filter so we only get members of our collection
// TODO: Resolver / Delta Filter?
@ -128,7 +128,7 @@ export abstract class Collection<View> {
}
debug(`Getting ids for collection ${this.name}`)
const ids = new Set<string>();
for (const [entityId, names] of this.rhizomeNode.lossless.referencedAs.entries()) {
for (const [entityId, names] of this.rhizomeNode.hyperview.referencedAs.entries()) {
if (names.has(this.name)) {
ids.add(entityId);
}
@ -160,7 +160,7 @@ export abstract class Collection<View> {
);
const ingested = new Promise<boolean>((resolve) => {
this.rhizomeNode!.lossless.eventStream.on("updated", (id: DomainEntityID) => {
this.rhizomeNode!.hyperview.eventStream.on("updated", (id: DomainEntityID) => {
if (id === entityId) resolve(true);
})
});
@ -178,7 +178,7 @@ export abstract class Collection<View> {
await this.rhizomeNode!.deltaStream.publishDelta(delta);
// ingest the delta as though we had received it from a peer
this.rhizomeNode!.lossless.ingestDelta(delta);
this.rhizomeNode!.hyperview.ingestDelta(delta);
});
// Return updated view of this entity

View File

@ -7,20 +7,20 @@ import {Collection} from '../collections/collection-abstract';
import {TimestampResolver} from '../views/resolvers/timestamp-resolvers';
export class BasicCollection extends Collection<TimestampResolver> {
declare lossy?: TimestampResolver;
declare view?: TimestampResolver;
initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new TimestampResolver(this.rhizomeNode.lossless);
this.view = new TimestampResolver(this.rhizomeNode.hyperview);
}
resolve(
id: string
) {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized');
if (!this.view) throw new Error('view not initialized');
const res = this.lossy.resolve([id]) || {};
const res = this.view.resolve([id]) || {};
return res[id];
}

View File

@ -6,20 +6,20 @@ class RelationalView extends TimestampResolver {
}
export class RelationalCollection extends Collection<RelationalView> {
declare lossy?: RelationalView;
declare view?: RelationalView;
initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new RelationalView(this.rhizomeNode.lossless);
this.view = new RelationalView(this.rhizomeNode.hyperview);
}
resolve(
id: string
): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized');
if (!this.view) throw new Error('view not initialized');
const res = this.lossy.resolve([id]) || {};
const res = this.view.resolve([id]) || {};
return res[id];
}

View File

@ -10,7 +10,7 @@ import {
SchemaApplicationOptions
} from '../schema/schema';
import { DefaultSchemaRegistry } from '../schema/schema-registry';
import { LosslessViewOne } from '../views/lossless';
import { HyperviewViewOne } from '../views/hyperview';
import { DomainEntityID } from '../core/types';
import { EntityProperties } from '../core/entity';
import { createDelta } from '@src/core';
@ -58,38 +58,38 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
initializeView(): void {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.lossy = new TimestampResolver(this.rhizomeNode.lossless);
this.view = new TimestampResolver(this.rhizomeNode.hyperview);
}
resolve(id: string): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.lossy) throw new Error('lossy view not initialized');
if (!this.view) throw new Error('view not initialized');
const res = this.lossy.resolve([id]) || {};
const res = this.view.resolve([id]) || {};
return res[id];
}
// Validate an entity against the schema
validate(entity: T): SchemaValidationResult {
// Convert entity to a mock lossless view for validation
const mockLosslessView: LosslessViewOne = {
// Convert entity to a mock hyperview for validation
const mockHyperviewView: HyperviewViewOne = {
id: 'validation-mock',
referencedAs: [],
propertyDeltas: {},
};
for (const [key, value] of Object.entries(entity)) {
mockLosslessView.propertyDeltas[key] = [createDelta('validation', 'validation')
mockHyperviewView.propertyDeltas[key] = [createDelta('validation', 'validation')
.addPointer(key, value as string)
.buildV1(),
];
}
return this.schemaRegistry.validate('validation-mock', this.schema.id, mockLosslessView);
return this.schemaRegistry.validate('validation-mock', this.schema.id, mockHyperviewView);
}
// Apply schema to a lossless view
apply(view: LosslessViewOne): SchemaAppliedView {
// Apply schema to a hyperview view
apply(view: HyperviewViewOne): SchemaAppliedView {
return this.schemaRegistry.applySchema(view, this.schema.id, this.applicationOptions);
}
@ -97,10 +97,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
getValidatedView(entityId: DomainEntityID): SchemaAppliedView | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) return undefined;
const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperviewView) return undefined;
return this.apply(losslessView);
return this.apply(hyperviewView);
}
// Get all entities in this collection with schema validation
@ -169,10 +169,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
for (const entityId of entityIds) {
if (!this.rhizomeNode) continue;
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) continue;
const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperviewView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView);
if (validationResult.valid) {
stats.validEntities++;
@ -200,16 +200,16 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
debug(`No rhizome node connected`)
return [];
}
const losslessView = this.rhizomeNode.lossless.compose(this.getIds());
if (!losslessView) {
debug(`No lossless view found`)
const hyperviewView = this.rhizomeNode.hyperview.compose(this.getIds());
if (!hyperviewView) {
debug(`No hyperview view found`)
return [];
}
debug(`getValidEntities, losslessView: ${JSON.stringify(losslessView, null, 2)}`)
debug(`getValidEntities, hyperviewView: ${JSON.stringify(hyperviewView, null, 2)}`)
debug(`Validating ${this.getIds().length} entities`)
return this.getIds().filter(entityId => {
debug(`Validating entity ${entityId}`)
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView[entityId]);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView[entityId]);
debug(`Validation result for entity ${entityId}: ${JSON.stringify(validationResult)}`)
return validationResult.valid;
});
@ -221,10 +221,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
const invalid: Array<{ entityId: DomainEntityID; errors: string[] }> = [];
for (const entityId of this.getIds()) {
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) continue;
const hyperviewView = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperviewView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperviewView);
if (!validationResult.valid) {
invalid.push({
entityId,


@ -1,24 +0,0 @@
// A delta represents an assertion from a given (perspective/POV/context).
// So we want it to be fluent to express these in the local context,
// and propagated as deltas in a configurable manner; i.e. configurable batches or immediate
// import {Delta} from '../core/types';
export class Entity {
}
export class Context {
}
export class Assertion {
}
export class View {
}
export class User {
}
export function example() {
}


@ -1,5 +1,4 @@
export * from './delta';
export * from './delta-builder';
export * from './types';
export * from './context';
export { Entity } from './entity';


@ -1,7 +1,7 @@
import Debug from "debug";
import EventEmitter from "events";
import {Delta, DeltaID} from "../core/delta";
import {Lossless} from "../views/lossless";
import {Hyperview} from "../views/hyperview";
import {DomainEntityID, TransactionID} from "../core/types";
const debug = Debug('rz:transactions');
@ -72,7 +72,7 @@ export class Transactions {
transactions = new Map<TransactionID, Transaction>();
eventStream = new EventEmitter();
constructor(readonly lossless: Lossless) {}
constructor(readonly hyperview: Hyperview) {}
get(id: TransactionID): Transaction | undefined {
return this.transactions.get(id);
@ -116,7 +116,7 @@ export class Transactions {
{
const {transactionId, size} = getTransactionSize(delta) || {};
if (transactionId && size) {
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`);
debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`);
this.setSize(transactionId, size as number);


@ -71,8 +71,8 @@ export class HttpApi {
res.json(this.rhizomeNode.peers.peers.length);
});
// Initialize lossless and query endpoints
this.serveLossless();
// Initialize hyperview and query endpoints
this.serveHyperview();
this.serveQuery();
}
@ -116,17 +116,17 @@ export class HttpApi {
});
}
serveLossless() {
serveHyperview() {
// Get all domain entity IDs. TODO: This won't scale
this.router.get('/lossless/ids', (_req: express.Request, res: express.Response) => {
this.router.get('/hyperview/ids', (_req: express.Request, res: express.Response) => {
res.json({
ids: Array.from(this.rhizomeNode.lossless.domainEntities.keys())
ids: Array.from(this.rhizomeNode.hyperview.domainEntities.keys())
});
});
// Get all transaction IDs. TODO: This won't scale
this.router.get('/transaction/ids', (_req: express.Request, res: express.Response) => {
const set = this.rhizomeNode.lossless.referencedAs.get("_transaction");
const set = this.rhizomeNode.hyperview.referencedAs.get("_transaction");
res.json({
ids: set ? Array.from(set.values()) : []
});
@ -135,7 +135,7 @@ export class HttpApi {
// View a single transaction
this.router.get('/transaction/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req;
const v = this.rhizomeNode.lossless.compose([id]);
const v = this.rhizomeNode.hyperview.compose([id]);
const ent = v[id];
if (!ent.referencedAs?.includes("_transaction")) {
res.status(400).json({error: "Entity is not a transaction", id});
@ -144,19 +144,19 @@ export class HttpApi {
res.json({
...ent,
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id)
isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
});
});
// Get a lossless view of a single domain entity
this.router.get('/lossless/:id', (req: express.Request, res: express.Response) => {
// Get a hyperview view of a single domain entity
this.router.get('/hyperview/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req;
const v = this.rhizomeNode.lossless.compose([id]);
const v = this.rhizomeNode.hyperview.compose([id]);
const ent = v[id];
res.json({
...ent,
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id)
isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
});
});
}


@ -2,7 +2,7 @@ import Debug from 'debug';
import {CREATOR, HTTP_API_ADDR, HTTP_API_ENABLE, HTTP_API_PORT, PEER_ID, PUBLISH_BIND_ADDR, PUBLISH_BIND_HOST, PUBLISH_BIND_PORT, REQUEST_BIND_ADDR, REQUEST_BIND_HOST, REQUEST_BIND_PORT, SEED_PEERS, STORAGE_TYPE, STORAGE_PATH} from './config';
import {DeltaStream, parseAddressList, PeerAddress, Peers, PubSub, RequestReply} from './network';
import {HttpServer} from './http/index';
import {Lossless} from './views';
import {Hyperview} from './views';
import {QueryEngine, StorageQueryEngine} from './query';
import {DefaultSchemaRegistry} from './schema';
import {DeltaQueryStorage, StorageFactory, StorageConfig} from './storage';
@ -29,7 +29,7 @@ export class RhizomeNode {
requestReply: RequestReply;
httpServer: HttpServer;
deltaStream: DeltaStream;
lossless: Lossless;
hyperview: Hyperview;
peers: Peers;
queryEngine: QueryEngine;
storageQueryEngine: StorageQueryEngine;
@ -70,14 +70,14 @@ export class RhizomeNode {
this.httpServer = new HttpServer(this);
this.deltaStream = new DeltaStream(this);
this.peers = new Peers(this);
this.lossless = new Lossless(this);
this.hyperview = new Hyperview(this);
this.schemaRegistry = new DefaultSchemaRegistry();
// Initialize storage backend
this.deltaStorage = StorageFactory.create(this.config.storage!);
// Initialize query engines (both lossless-based and storage-based)
this.queryEngine = new QueryEngine(this.lossless, this.schemaRegistry);
// Initialize query engines (both hyperview-based and storage-based)
this.queryEngine = new QueryEngine(this.hyperview, this.schemaRegistry);
this.storageQueryEngine = new StorageQueryEngine(this.deltaStorage, this.schemaRegistry);
}
@ -91,10 +91,10 @@ export class RhizomeNode {
}: { syncOnStart?: boolean } = {}): Promise<void> {
debug(`[${this.config.peerId}]`, 'Starting node (waiting for ready)...');
// Connect our lossless view to the delta stream
// Connect our hyperview to the delta stream
this.deltaStream.subscribeDeltas(async (delta) => {
// Ingest into lossless view
this.lossless.ingestDelta(delta);
// Ingest into the hyperview
this.hyperview.ingestDelta(delta);
// Also store in persistent storage
try {
@ -150,11 +150,11 @@ export class RhizomeNode {
}
/**
* Sync existing lossless view data to persistent storage
* Sync existing hyperview data to persistent storage
* Useful for migrating from memory-only to persistent storage
*/
async syncToStorage(): Promise<void> {
debug(`[${this.config.peerId}]`, 'Syncing lossless view to storage');
debug(`[${this.config.peerId}]`, 'Syncing hyperview to storage');
const allDeltas = this.deltaStream.deltasAccepted;
let synced = 0;


@ -2,7 +2,7 @@ import jsonLogic from 'json-logic-js';
const { apply, is_logic } = jsonLogic;
import Debug from 'debug';
import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema';
import { Lossless, LosslessViewMany, LosslessViewOne, valueFromDelta } from '../views/lossless';
import { Hyperview, HyperviewViewMany, HyperviewViewOne, valueFromDelta } from '../views/hyperview';
import { DomainEntityID } from '../core/types';
import { Delta, DeltaFilter } from '../core/delta';
@ -31,14 +31,14 @@ export interface QueryOptions {
}
export interface QueryResult {
entities: LosslessViewMany;
entities: HyperviewViewMany;
totalFound: number;
limited: boolean;
}
export class QueryEngine {
constructor(
private lossless: Lossless,
private hyperview: Hyperview,
private schemaRegistry: SchemaRegistry
) {}
@ -96,12 +96,12 @@ export class QueryEngine {
const candidateEntityIds = this.discoverEntitiesBySchema(schemaId);
debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`);
// 2. Compose lossless views for all candidates
const allViews = this.lossless.compose(candidateEntityIds, options.deltaFilter);
debug(`Composed ${Object.keys(allViews).length} lossless views`);
// 2. Compose hyperview views for all candidates
const allViews = this.hyperview.compose(candidateEntityIds, options.deltaFilter);
debug(`Composed ${Object.keys(allViews).length} hyperview views`);
// 3. Apply JSON Logic filter if provided
let filteredViews: LosslessViewMany = allViews;
let filteredViews: HyperviewViewMany = allViews;
if (filter) {
filteredViews = this.applyJsonLogicFilter(allViews, filter, schemaId);
@ -132,10 +132,10 @@ export class QueryEngine {
/**
* Query for a single entity by ID with schema validation
*/
async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<LosslessViewOne | null> {
async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<HyperviewViewOne | null> {
debug(`Querying single entity ${entityId} with schema ${schemaId}`);
const views = this.lossless.compose([entityId]);
const views = this.hyperview.compose([entityId]);
const view = views[entityId];
if (!view) {
@ -165,7 +165,7 @@ export class QueryEngine {
// Strategy: Find entities that have deltas for the schema's required properties
const requiredProperties = schema.requiredProperties || [];
const allEntityIds = Array.from(this.lossless.domainEntities.keys());
const allEntityIds = Array.from(this.hyperview.domainEntities.keys());
if (requiredProperties.length === 0) {
// No required properties - return all entities
@ -177,7 +177,7 @@ export class QueryEngine {
const candidateEntities: DomainEntityID[] = [];
for (const entityId of allEntityIds) {
const entity = this.lossless.domainEntities.get(entityId);
const entity = this.hyperview.domainEntities.get(entityId);
if (!entity) continue;
// Check if entity has deltas for all required properties
@ -195,28 +195,28 @@ export class QueryEngine {
}
/**
* Apply JSON Logic filter to lossless views
* This requires converting each lossless view to a queryable object
* Apply JSON Logic filter to hyperview views
* This requires converting each hyperview view to a queryable object
*/
private applyJsonLogicFilter(
views: LosslessViewMany,
views: HyperviewViewMany,
filter: JsonLogic,
schemaId: SchemaID
): LosslessViewMany {
): HyperviewViewMany {
const schema = this.schemaRegistry.get(schemaId);
if (!schema) {
debug(`Cannot filter without schema ${schemaId}`);
return views;
}
const filteredViews: LosslessViewMany = {};
const filteredViews: HyperviewViewMany = {};
let hasFilterErrors = false;
const filterErrors: string[] = [];
for (const [entityId, view] of Object.entries(views)) {
try {
// Convert lossless view to queryable object using schema
const queryableObject = this.losslessViewToQueryableObject(view, schema);
// Convert hyperview view to queryable object using schema
const queryableObject = this.hyperviewViewToQueryableObject(view, schema);
// Apply JSON Logic filter
const matches = apply(filter, queryableObject);
@ -246,16 +246,16 @@ export class QueryEngine {
}
/**
* Convert a lossless view to a queryable object based on schema
* Convert a hyperview view to a queryable object based on schema
* Uses simple resolution strategies for now
*/
private losslessViewToQueryableObject(view: LosslessViewOne, schema: ObjectSchema): Record<string, unknown> {
private hyperviewViewToQueryableObject(view: HyperviewViewOne, schema: ObjectSchema): Record<string, unknown> {
const obj: Record<string, unknown> = {
id: view.id,
_referencedAs: view.referencedAs
};
// Convert each schema property from lossless view deltas
// Convert each schema property from hyperview view deltas
for (const [propertyId, propertySchema] of Object.entries(schema.properties)) {
const deltas = view.propertyDeltas[propertyId] || [];
@ -315,7 +315,7 @@ export class QueryEngine {
/**
* Check if an entity matches a schema (basic validation)
*/
private entityMatchesSchema(view: LosslessViewOne, schemaId: SchemaID): boolean {
private entityMatchesSchema(view: HyperviewViewOne, schemaId: SchemaID): boolean {
const schema = this.schemaRegistry.get(schemaId);
if (!schema) return false;
@ -337,7 +337,7 @@ export class QueryEngine {
* Get statistics about queryable entities
*/
getStats() {
const totalEntities = this.lossless.domainEntities.size;
const totalEntities = this.hyperview.domainEntities.size;
const registeredSchemas = this.schemaRegistry.list().length;
return {


@ -14,9 +14,9 @@ import {
SchemaApplicationOptions,
ResolutionContext
} from '../schema/schema';
import { Lossless, LosslessViewOne } from '../views/lossless';
import { Hyperview, HyperviewViewOne } from '../views/hyperview';
import { DomainEntityID, PropertyID, PropertyTypes } from '../core/types';
import { CollapsedDelta } from '../views/lossless';
import { CollapsedDelta } from '../views/hyperview';
import { Delta } from '@src/core';
const debug = Debug('rz:schema-registry');
@ -103,7 +103,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
}
}
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult {
validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewViewOne): SchemaValidationResult {
const schema = this.get(schemaId);
if (!schema) {
return {
@ -282,7 +282,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
}
applySchema(
view: LosslessViewOne,
view: HyperviewViewOne,
schemaId: SchemaID,
options: SchemaApplicationOptions = {}
): SchemaAppliedView {
@ -333,9 +333,9 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
* Resolves references to other entities according to schema specifications
*/
applySchemaWithNesting(
view: LosslessViewOne,
view: HyperviewViewOne,
schemaId: SchemaID,
losslessView: Lossless,
hyperviewView: Hyperview,
options: SchemaApplicationOptions = {}
): SchemaAppliedViewWithNesting {
const { maxDepth = 3, includeMetadata = true, strictValidation = false } = options;
@ -344,16 +344,16 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
return this.resolveNestedView(
view,
schemaId,
losslessView,
hyperviewView,
resolutionContext,
{ includeMetadata, strictValidation }
);
}
private resolveNestedView(
view: LosslessViewOne,
view: HyperviewViewOne,
schemaId: SchemaID,
losslessView: Lossless,
hyperviewView: Hyperview,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting {
@ -401,7 +401,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty(
deltas,
referenceSchema,
losslessView,
hyperviewView,
context.withDepth(context.currentDepth + 1),
options,
view.id
@ -415,7 +415,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty(
deltas,
referenceSchema,
losslessView,
hyperviewView,
context.withDepth(context.currentDepth + 1),
options,
view.id
@ -449,7 +449,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
private resolveReferenceProperty(
deltas: Delta[],
referenceSchema: ReferenceSchema,
losslessView: Lossless,
hyperviewView: Hyperview,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean },
parentEntityId: string
@ -469,7 +469,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta,
parentEntityId,
referenceSchema.targetSchema,
losslessView,
hyperviewView,
context,
options
);
@ -480,8 +480,8 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const referenceIds = this.extractReferenceIdsFromDelta(delta, parentEntityId);
for (const referenceId of referenceIds) {
try {
// Get the referenced entity's lossless view
const referencedViews = losslessView.compose([referenceId]);
// Get the referenced entity's hyperview view
const referencedViews = hyperviewView.compose([referenceId]);
const referencedView = referencedViews[referenceId];
if (referencedView) {
@ -489,7 +489,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView(
referencedView,
referenceSchema.targetSchema,
losslessView,
hyperviewView,
context,
options
);
@ -514,7 +514,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta: Delta,
parentEntityId: string,
targetSchema: SchemaID,
losslessView: Lossless,
hyperviewView: Hyperview,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting | null {
@ -536,7 +536,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
// Count entity references vs scalars
if (typeof target === 'string') {
const referencedViews = losslessView.compose([target]);
const referencedViews = hyperviewView.compose([target]);
if (referencedViews[target]) {
entityReferenceCount++;
} else {
@ -568,7 +568,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') {
// Try to resolve as entity reference
try {
const referencedViews = losslessView.compose([target]);
const referencedViews = hyperviewView.compose([target]);
const referencedView = referencedViews[target];
if (referencedView) {
@ -576,7 +576,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView(
referencedView,
targetSchema,
losslessView,
hyperviewView,
context,
options
);
@ -601,14 +601,14 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') {
// Try to resolve as entity reference
try {
const referencedViews = losslessView.compose([target]);
const referencedViews = hyperviewView.compose([target]);
const referencedView = referencedViews[target];
if (referencedView) {
const nestedView = this.resolveNestedView(
referencedView,
targetSchema,
losslessView,
hyperviewView,
context,
options
);


@ -1,6 +1,6 @@
import { DomainEntityID, PropertyID, PropertyTypes } from "../core/types";
import { LosslessViewOne } from "../views/lossless";
import { CollapsedDelta } from "../views/lossless";
import { HyperviewViewOne } from "../views/hyperview";
import { CollapsedDelta } from "../views/hyperview";
// Base schema types
export type SchemaID = string;
@ -52,7 +52,7 @@ export interface SchemaRegistry {
register(schema: ObjectSchema): void;
get(id: SchemaID): ObjectSchema | undefined;
list(): ObjectSchema[];
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult;
validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewViewOne): SchemaValidationResult;
}
// Validation result types
@ -76,7 +76,7 @@ export interface SchemaApplicationOptions {
strictValidation?: boolean;
}
// Applied schema result - a lossless view filtered through a schema
// Applied schema result - a hyperview view filtered through a schema
export interface SchemaAppliedView {
id: DomainEntityID;
schemaId: SchemaID;
@ -105,7 +105,7 @@ export interface SchemaAppliedViewWithNesting extends SchemaAppliedView {
export interface TypedCollection<T> {
schema: ObjectSchema;
validate(entity: T): SchemaValidationResult;
apply(view: LosslessViewOne): SchemaAppliedView;
apply(view: HyperviewViewOne): SchemaAppliedView;
}
// Built-in schema helpers


@ -9,7 +9,7 @@ import {Transactions} from '../features/transactions';
import {DomainEntityID, PropertyID, PropertyTypes, TransactionID, ViewMany} from "../core/types";
import {Negation} from '../features/negation';
import {NegationHelper} from '../features/negation';
const debug = Debug('rz:lossless');
const debug = Debug('rz:hyperview');
export type CollapsedPointer = {[key: PropertyID]: PropertyTypes};
@ -49,7 +49,7 @@ export function valueFromDelta(
}
// TODO: Store property deltas as references to reduce memory footprint
export type LosslessViewOne = {
export type HyperviewViewOne = {
id: DomainEntityID,
referencedAs?: string[];
propertyDeltas: {
@ -57,21 +57,21 @@ export type LosslessViewOne = {
}
}
export type CollapsedViewOne = Omit<LosslessViewOne, 'propertyDeltas'> & {
export type CollapsedViewOne = Omit<HyperviewViewOne, 'propertyDeltas'> & {
propertyCollapsedDeltas: {
[key: PropertyID]: CollapsedDelta[]
}
};
export type LosslessViewMany = ViewMany<LosslessViewOne>;
export type HyperviewViewMany = ViewMany<HyperviewViewOne>;
export type CollapsedViewMany = ViewMany<CollapsedViewOne>;
class LosslessEntityMap extends Map<DomainEntityID, LosslessEntity> {};
class HyperviewEntityMap extends Map<DomainEntityID, HyperviewEntity> {};
class LosslessEntity {
class HyperviewEntity {
properties = new Map<PropertyID, Set<Delta>>();
constructor(readonly lossless: Lossless, readonly id: DomainEntityID) {}
constructor(readonly hyperview: Hyperview, readonly id: DomainEntityID) {}
addDelta(delta: Delta | DeltaV2) {
// Convert DeltaV2 to DeltaV1 if needed
@ -93,7 +93,7 @@ class LosslessEntity {
propertyDeltas.add(delta);
}
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta));
debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta));
}
toJSON() {
@ -103,14 +103,14 @@ class LosslessEntity {
}
return {
id: this.id,
referencedAs: Array.from(this.lossless.referencedAs.get(this.id) ?? []),
referencedAs: Array.from(this.hyperview.referencedAs.get(this.id) ?? []),
properties
};
}
}
export class Lossless {
domainEntities = new LosslessEntityMap();
export class Hyperview {
domainEntities = new HyperviewEntityMap();
transactions: Transactions;
eventStream = new EventEmitter();
@ -160,7 +160,7 @@ export class Lossless {
for (const entityId of affectedEntities) {
let ent = this.domainEntities.get(entityId);
if (!ent) {
ent = new LosslessEntity(this, entityId);
ent = new HyperviewEntity(this, entityId);
this.domainEntities.set(entityId, ent);
}
// Add negation delta to the entity
@ -189,7 +189,7 @@ export class Lossless {
let ent = this.domainEntities.get(target);
if (!ent) {
ent = new LosslessEntity(this, target);
ent = new HyperviewEntity(this, target);
this.domainEntities.set(target, ent);
}
@ -208,7 +208,7 @@ export class Lossless {
return transactionId;
}
decompose(view: LosslessViewOne): Delta[] {
decompose(view: HyperviewViewOne): Delta[] {
const allDeltas: Delta[] = [];
const seenDeltaIds = new Set<DeltaID>();
@ -225,8 +225,8 @@ export class Lossless {
return allDeltas;
}
compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): LosslessViewMany {
const view: LosslessViewMany = {};
compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): HyperviewViewMany {
const view: HyperviewViewMany = {};
entityIds = entityIds ?? Array.from(this.domainEntities.keys());
for (const entityId of entityIds) {
@ -391,4 +391,4 @@ export class Lossless {
}
// TODO: point-in-time queries
}
}

View File

@ -1,3 +1,3 @@
export * from './lossless';
export * from './lossy';
export * from './hyperview';
export * from './view';
export * from './resolvers';


@ -1,7 +1,7 @@
import { Lossless, LosslessViewOne } from "../lossless";
import { Lossy } from '../lossy';
import { Hyperview, HyperviewViewOne } from "../hyperview";
import { Lossy } from '../view';
import { DomainEntityID, PropertyID, ViewMany } from "../../core/types";
import { valueFromDelta } from "../lossless";
import { valueFromDelta } from "../hyperview";
import { EntityRecord, EntityRecordMany } from "@src/core/entity";
export type AggregationType = 'min' | 'max' | 'sum' | 'average' | 'count';
@ -53,13 +53,13 @@ function aggregateValues(values: number[], type: AggregationType): number {
export class AggregationResolver extends Lossy<Accumulator, Result> {
constructor(
lossless: Lossless,
hyperview: Hyperview,
private config: AggregationConfig
) {
super(lossless);
super(hyperview);
}
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator {
if (!acc[cur.id]) {
acc[cur.id] = { id: cur.id, properties: {} };
}
@ -111,41 +111,41 @@ export class AggregationResolver extends Lossy<Accumulator, Result> {
// Convenience classes for common aggregation types
export class MinResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'min');
super(lossless, config);
super(hyperview, config);
}
}
export class MaxResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'max');
super(lossless, config);
super(hyperview, config);
}
}
export class SumResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'sum');
super(lossless, config);
super(hyperview, config);
}
}
export class AverageResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'average');
super(lossless, config);
super(hyperview, config);
}
}
export class CountResolver extends AggregationResolver {
constructor(lossless: Lossless, properties: PropertyID[]) {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'count');
super(lossless, config);
super(hyperview, config);
}
}


@ -1,5 +1,5 @@
import { PropertyID, PropertyTypes } from "../../../core/types";
import { CollapsedDelta } from "../../lossless";
import { CollapsedDelta } from "../../hyperview";
import Debug from 'debug';
const debug = Debug('rz:custom-resolver:plugin');


@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../../views/lossless";
import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin";
import Debug from 'debug';
const debug = Debug('rz:concatenation-plugin');


@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../lossless";
import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin";
type FirstWriteWinsState = {


@ -1,5 +1,5 @@
import { PropertyTypes } from "../../../../core/types";
import { CollapsedDelta } from "../../../lossless";
import { CollapsedDelta } from "../../../hyperview";
import { ResolverPlugin } from "../plugin";
type LastWriteWinsState = {


@ -1,5 +1,5 @@
import { PropertyTypes } from "@src/core/types";
import { CollapsedDelta } from "@src/views/lossless";
import { CollapsedDelta } from "@src/views/hyperview";
import { ResolverPlugin, DependencyStates } from "../plugin";
type RunningAverageState = {


@ -1,5 +1,5 @@
import { Lossless, LosslessViewOne } from "../../lossless";
import { Lossy } from '../../lossy';
import { Hyperview, HyperviewViewOne } from "../../hyperview";
import { Lossy } from '../../view';
import { DomainEntityID, PropertyID, PropertyTypes } from "../../../core/types";
import { ResolverPlugin, DependencyStates } from "./plugin";
import { EntityRecord } from "@src/core/entity";
@ -46,14 +46,14 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
/**
* Creates a new CustomResolver instance
* @param lossless - The Lossless instance to use for delta tracking
* @param hyperview - The Hyperview instance to use for delta tracking
* @param config - A mapping of property IDs to their resolver plugins
*/
constructor(
lossless: Lossless,
hyperview: Hyperview,
config: PluginMap
) {
super(lossless);
super(hyperview);
this.config = config;
this.buildDependencyGraph();
this.executionOrder = this.calculateExecutionOrder();
@ -244,7 +244,7 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
/**
* Update the state with new deltas from the view
*/
reducer(acc: Accumulator, {id: entityId, propertyDeltas}: LosslessViewOne): Accumulator {
reducer(acc: Accumulator, {id: entityId, propertyDeltas}: HyperviewViewOne): Accumulator {
debug(`Processing deltas for entity: ${entityId}`);
debug('Property deltas:', JSON.stringify(propertyDeltas));


@ -1,6 +1,6 @@
import { EntityProperties } from "../../core/entity";
import { Lossless, CollapsedDelta, valueFromDelta, LosslessViewOne } from "../lossless";
import { Lossy } from '../lossy';
import { Hyperview, CollapsedDelta, valueFromDelta, HyperviewViewOne } from "../hyperview";
import { Lossy } from '../view';
import { DomainEntityID, PropertyID, PropertyTypes, Timestamp, ViewMany } from "../../core/types";
import Debug from 'debug';
@ -77,13 +77,13 @@ function compareWithTieBreaking(
export class TimestampResolver extends Lossy<Accumulator, Result> {
constructor(
lossless: Lossless,
hyperview: Hyperview,
private tieBreakingStrategy: TieBreakingStrategy = 'delta-id'
) {
super(lossless);
super(hyperview);
}
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator {
if (!acc[cur.id]) {
acc[cur.id] = { id: cur.id, properties: {} };
}
@ -138,26 +138,26 @@ export class TimestampResolver extends Lossy<Accumulator, Result> {
// Convenience classes for different tie-breaking strategies
export class CreatorIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) {
super(lossless, 'creator-id');
constructor(hyperview: Hyperview) {
super(hyperview, 'creator-id');
}
}
export class DeltaIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) {
super(lossless, 'delta-id');
constructor(hyperview: Hyperview) {
super(hyperview, 'delta-id');
}
}
export class HostIdTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) {
super(lossless, 'host-id');
constructor(hyperview: Hyperview) {
super(hyperview, 'host-id');
}
}
export class LexicographicTimestampResolver extends TimestampResolver {
constructor(lossless: Lossless) {
super(lossless, 'lexicographic');
constructor(hyperview: Hyperview) {
super(hyperview, 'lexicographic');
}
}
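The four convenience subclasses above differ only in which secondary key breaks timestamp ties. The core comparison can be sketched standalone (the `Candidate` shape and `pickWinner` name here are illustrative, not the actual `compareWithTieBreaking` signature):

```typescript
type Strategy = 'delta-id' | 'creator-id' | 'host-id' | 'lexicographic';

interface Candidate {
  value: string;
  timestamp: number;
  deltaId: string;
  creatorId: string;
  hostId: string;
}

// Later timestamp wins; on a tie, a deterministic secondary key decides,
// so all peers converge on the same winner regardless of arrival order.
function pickWinner(a: Candidate, b: Candidate, strategy: Strategy): Candidate {
  if (a.timestamp !== b.timestamp) {
    return a.timestamp > b.timestamp ? a : b;
  }
  const key = (c: Candidate): string => {
    switch (strategy) {
      case 'delta-id': return c.deltaId;
      case 'creator-id': return c.creatorId;
      case 'host-id': return c.hostId;
      case 'lexicographic': return c.value;
    }
  };
  return key(a) > key(b) ? a : b;
}
```

Determinism is the point: without a fixed tie-breaking key, two peers seeing the same concurrent writes could resolve to different values.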

View File

@ -1,12 +1,12 @@
// We have the lossless transformation of the delta stream.
// We want to enable transformations from the lossless view,
// into various possible "lossy" views that combine or exclude some information.
// We have the hyperview transformation of the delta stream.
// We want to enable transformations from the hyperview view,
// into various possible "view" views that combine or exclude some information.
import Debug from 'debug';
import {Delta, DeltaFilter, DeltaID} from "../core/delta";
import {Lossless, LosslessViewOne} from "./lossless";
import {Hyperview, HyperviewViewOne} from "./hyperview";
import {DomainEntityID, PropertyID, PropertyTypes, ViewMany} from "../core/types";
const debug = Debug('rz:lossy');
const debug = Debug('rz:view');
type PropertyMap = Record<PropertyID, PropertyTypes>;
@ -17,20 +17,20 @@ export type LossyViewOne<T = PropertyMap> = {
export type LossyViewMany<T = PropertyMap> = ViewMany<LossyViewOne<T>>;
// We support incremental updates of lossy models.
// We support incremental updates of view models.
export abstract class Lossy<Accumulator, Result = Accumulator> {
deltaFilter?: DeltaFilter;
private accumulator?: Accumulator;
initializer?(): Accumulator;
abstract reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator;
abstract reducer(acc: Accumulator, cur: HyperviewViewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
constructor(
readonly lossless: Lossless,
readonly hyperview: Hyperview,
) {
this.lossless.eventStream.on("updated", (id, deltaIds) => {
debug(`[${this.lossless.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`,
this.hyperview.eventStream.on("updated", (id, deltaIds) => {
debug(`[${this.hyperview.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`,
JSON.stringify(deltaIds));
this.ingestUpdate(id, deltaIds);
@ -46,16 +46,16 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
if (!this.deltaFilter) return true;
return this.deltaFilter(delta);
};
const losslessPartial = this.lossless.compose([entityId], combinedFilter);
const hyperviewPartial = this.hyperview.compose([entityId], combinedFilter);
if (!losslessPartial) {
// This should not happen; this should only be called after the lossless view has been updated
console.error(`Lossless view for entity ${entityId} not found`);
if (!hyperviewPartial) {
// This should not happen; this should only be called after the hyperview view has been updated
console.error(`Hyperview view for entity ${entityId} not found`);
return;
}
const latest = this.accumulator || this.initializer?.() || {} as Accumulator;
this.accumulator = this.reducer(latest, losslessPartial[entityId]);
this.accumulator = this.reducer(latest, hyperviewPartial[entityId]);
}
// Resolve the current state of the view
@ -65,7 +65,7 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
}
if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys());
entityIds = Array.from(this.hyperview.domainEntities.keys());
}
if (!this.resolver) {
@ -76,3 +76,5 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
}
}
// "Lossy" can simply be called "View"
export abstract class View<Accumulator, Result> extends Lossy<Accumulator, Result> {};
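The abstract class in this hunk is essentially an incremental fold: an optional `initializer` seeds the accumulator, `reducer` absorbs each per-entity update, and `resolver` produces the final result on demand. A minimal standalone sketch of that pattern (illustrative names, not the actual rhizome-node API):

```typescript
interface EntityUpdate {
  id: string;
  properties: Record<string, unknown>;
}

// A view is a fold over per-entity updates plus a final transformation.
class SimpleView<Acc, Result = Acc> {
  private acc: Acc;
  constructor(
    initial: Acc,
    private reducer: (acc: Acc, update: EntityUpdate) => Acc,
    private resolver: (acc: Acc) => Result,
  ) {
    this.acc = initial;
  }
  ingest(update: EntityUpdate): void {
    this.acc = this.reducer(this.acc, update); // incremental, no full recompute
  }
  resolve(): Result {
    return this.resolver(this.acc);
  }
}

// Usage: count how many updates each entity has received.
const counts = new SimpleView<Record<string, number>>(
  {},
  (acc, { id }) => ({ ...acc, [id]: (acc[id] ?? 0) + 1 }),
  (acc) => acc,
);
counts.ingest({ id: 'e1', properties: { name: 'Alice' } });
counts.ingest({ id: 'e1', properties: { name: 'Bob' } });
counts.ingest({ id: 'e2', properties: {} });
```

Keeping the accumulator between updates is what lets the real class react to the hyperview's `updated` events without replaying the whole delta history each time.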

View File

@ -7,7 +7,7 @@
- **Delta Lifecycle**:
- Creation via `DeltaBuilder`
- Propagation through `DeltaStream`
- Storage in `Lossless` view
- Storage in `Hyperview` view
- Transformation in `Lossy` views
### 2. Network Layer
@ -22,7 +22,7 @@
### 3. Storage
- **In-Memory Storage**:
- `Lossless` view maintains complete delta history
- `Hyperview` view maintains complete delta history
- `Lossy` views provide optimized access patterns
- **Persistence**:
- LevelDB integration
@ -49,8 +49,8 @@
- `request-reply.ts`: Direct node communication
3. **Views**:
- `lossless.ts`: Complete delta history
- `lossy.ts`: Derived, optimized views
- `hyperview.ts`: Complete delta history
- `view.ts`: Derived, optimized views
4. **Schema**:
- `schema.ts`: Type definitions
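The storage split described above — a hyperview retaining complete delta history while derived views collapse it — can be sketched in miniature (illustrative types only, not the actual rhizome-node API):

```typescript
interface Delta {
  id: string;
  timestamp: number;
  value: unknown;
}

class TinyHyperview {
  // entity -> property -> full delta history
  private deltas = new Map<string, Map<string, Delta[]>>();

  ingest(entityId: string, property: string, delta: Delta): void {
    const props = this.deltas.get(entityId) ?? new Map<string, Delta[]>();
    const history = props.get(property) ?? [];
    history.push(delta); // nothing is discarded: full history retained
    props.set(property, history);
    this.deltas.set(entityId, props);
  }

  // A derived view collapses history on demand; here, last-write-wins.
  latest(entityId: string, property: string): unknown {
    const history = this.deltas.get(entityId)?.get(property) ?? [];
    return history.reduce<Delta | undefined>(
      (best, d) => (!best || d.timestamp > best.timestamp ? d : best),
      undefined,
    )?.value;
  }
}
```

Because the hyperview keeps every delta, a different view (earliest-write, running average, negation-aware) can always be derived later from the same history.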

todo.md
View File

@ -12,7 +12,7 @@ This document tracks work needed to achieve full specification compliance, organ
- [x] Add validation for pointer consistency
### 1.2 Complete Transaction Support ✅ (mostly)
- [x] Implement transaction-based filtering in lossless views
- [x] Implement transaction-based filtering in hyperview views
- [x] Add transaction grouping in delta streams
- [x] Test atomic transaction operations
- [ ] Add transaction rollback capabilities (deferred - not critical for spec parity)
@ -29,8 +29,8 @@ This document tracks work needed to achieve full specification compliance, organ
### 2.1 Negation Deltas ✅
- [x] Implement negation delta type with "negates" pointer
- [x] Add "negated_by" context handling
- [x] Update lossless view to handle negations
- [x] Update lossy resolvers to respect negations
- [x] Update hyperview view to handle negations
- [x] Update view resolvers to respect negations
- [x] Add comprehensive negation tests
### 2.2 Advanced Conflict Resolution ✅
@ -50,7 +50,7 @@ This document tracks work needed to achieve full specification compliance, organ
### 3.1 Query Engine Foundation ✅
- [x] Implement JSON Logic parser (using json-logic-js)
- [x] Create query planner for lossless views
- [x] Create query planner for hyperview views
- [x] Add query execution engine (QueryEngine class)
- [x] Implement schema-driven entity discovery
- [x] Enable the skipped query tests
@ -91,7 +91,7 @@ This document tracks work needed to achieve full specification compliance, organ
### 4.4 Schema-as-Deltas (Meta-Schema System)
- [ ] Define schema entities that are stored as deltas in the system
- [ ] Implement schema queries that return schema instances from lossless views
- [ ] Implement schema queries that return schema instances from hyperview views
- [ ] Create schema evolution through delta mutations
- [ ] Add temporal schema queries (schema time-travel)
- [ ] Build schema conflict resolution for competing schema definitions
@ -186,7 +186,7 @@ This document tracks work needed to achieve full specification compliance, organ
1. **Phase 1** ✅ - These are foundational requirements
2. **Phase 2.1 (Negation)** ✅ - Core spec feature that affects all views
3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper lossy views
3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper view views
4. **Phase 2.3 (Nesting)** ✅ - Depends on schemas and queries
5. **Phase 3 (Query)** ✅ - Unlocks powerful data access
6. **Phase 4 (Relational)** - Builds on query system