Compare commits


No commits in common. "b65585edf9d4816bb49be029ecd1236a9964c0d4" and "67d29f4c5def6dd2cffd5a1a4cfd1246584bdfc0" have entirely different histories.

68 changed files with 746 additions and 862 deletions

View File

@@ -42,7 +42,7 @@ npm run example-app
- DeltaV2 is the current format (DeltaV1 is legacy)
2. **Views**: Different ways to interpret the delta stream:
- **Hyperview View**: Stores all deltas without conflict resolution
- **Lossless View**: Stores all deltas without conflict resolution
- **Lossy Views**: Apply conflict resolution (e.g., Last-Write-Wins)
- Custom resolvers can be implemented
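As a rough illustration of the distinction (toy stand-ins only; `Delta`, `ToyLosslessStore`, and `lastWriteWins` here are invented for this sketch, not the project's API):

```typescript
// Toy stand-ins, not the project's actual classes: a lossless store keeps every
// delta per property; a lossy resolver reduces each group to a single value.
type Delta = { id: string; timestamp: number; property: string; value: unknown };

class ToyLosslessStore {
  private byProperty = new Map<string, Delta[]>();

  ingest(delta: Delta): void {
    const bucket = this.byProperty.get(delta.property) ?? [];
    bucket.push(delta);
    this.byProperty.set(delta.property, bucket);
  }

  deltasFor(property: string): Delta[] {
    return this.byProperty.get(property) ?? [];
  }
}

// Last-Write-Wins: resolve a property to the value of its newest delta.
function lastWriteWins(store: ToyLosslessStore, property: string): unknown {
  const deltas = store.deltasFor(property);
  if (deltas.length === 0) return undefined;
  return deltas.reduce((a, b) => (b.timestamp >= a.timestamp ? b : a)).value;
}
```

In the real codebase the analogous pieces are the `Lossless` view and resolver classes such as `TimestampResolver`; the toy above only illustrates the retain-everything vs. reduce distinction.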
@@ -60,7 +60,7 @@ npm run example-app
- `src/node.ts`: Main `RhizomeNode` class orchestrating all components
- `src/delta.ts`: Delta data structures and conversion logic
- `src/hyperview.ts`: Core hyperview implementation
- `src/lossless.ts`: Core lossless view implementation
- `src/collection-basic.ts`: Basic collection implementation
- `src/http/api.ts`: REST API endpoints
- `src/pub-sub.ts`: Network communication layer
@@ -78,7 +78,7 @@ The HTTP API provides RESTful endpoints:
- `GET/PUT /collection/:name/:id` - Entity operations
- `GET /peers` - Peer information
- `GET /deltas/stats` - Delta statistics
- `GET /hyperview/:entityId` - Raw delta access
- `GET /lossless/:entityId` - Raw delta access
### Important Implementation Notes

View File

@@ -150,13 +150,13 @@ curl -s -X PUT -H 'content-type:application/json' -d @/tmp/user.json http://loca
| Peering | Yes | Implemented with ZeroMQ and/or Libp2p. Libp2p solves more problems. |
| Schemas | Not really | Currently very thin layer allowing TypedCollections |
| Relationships | No | Supporting relational algebra among domain entities |
| Views | Yes | Hyperview: Map the `targetContext`s as properties of domain entities. |
| Views | Yes | Lossless: Map the `targetContext`s as properties of domain entities. |
| | | Lossy: Use a delta filter and a resolver function to produce a view. |
| | | Currently using functions rather than JSON-Logic expressions. |
| Functions | No | Arbitrary subscribers to delta stream (that can also emit deltas?) |
| Tests | Yes | We are set up to run unit tests and multi-node tests |
| Identity | Sort of | We have an identity service via Libp2p |
| Contexts | No | Each context may involve different view functions and delta filters |
| Contexts | No | Each context may involve different lossy functions and delta filters |
| HTTP API | Yes | Basic peering info and entity CRUD |
If we express views and filter rules as JSON-Logic, we can easily include them in records.
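To make that idea concrete, here is a hand-rolled sketch of what a JSON-Logic-encoded delta filter could look like (a minimal evaluator for a small operator subset, invented for illustration; it is not the json-logic-js library or the project's filter API):

```typescript
// Minimal JSON-Logic-style rule type: literals plus 'var', '==', and 'and'.
type Rule =
  | { var: string }
  | { '==': [Rule, Rule] }
  | { and: Rule[] }
  | string
  | number
  | boolean;

function applyRule(rule: Rule, data: Record<string, unknown>): unknown {
  if (typeof rule !== 'object') return rule;          // literal value
  if ('var' in rule) return data[rule.var];           // variable lookup
  if ('==' in rule) {
    const [a, b] = rule['=='];
    return applyRule(a, data) === applyRule(b, data); // equality
  }
  if ('and' in rule) return rule.and.every(r => applyRule(r, data) === true);
  throw new Error('unsupported operator');
}

// A delta filter stored as plain data inside a record:
const onlyWriter1: Rule = { '==': [{ var: 'creator' }, 'writer1'] };
```

Because the rule is plain JSON, it can be shipped in a delta or record and evaluated on any peer, which is the portability the table entry is pointing at.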

View File

@@ -90,7 +90,7 @@ graph TD
- Design storage layer (Mnesia/ETS)
- **View System**
- Implement Lossy/Hyperviews
- Implement Lossy/Lossless views
- Create resolver framework
- Add caching layer
@@ -237,7 +237,7 @@ sequenceDiagram
- [ ] Define delta types and operations
- [ ] Implement DeltaBuilder
- [ ] Basic storage with Mnesia/ETS
- [ ] View system with Lossy/Hyperview support
- [ ] View system with Lossy/Lossless support
### 2. Distributed Foundation
- [ ] Node discovery and membership

View File

@@ -2,10 +2,10 @@
- [x] Organize tests?
- [x] More documentation in docs/
- [x] Rename/consolidate, hyperview view() and compose()
- [x] Rename Lossless to HyperView
- [x] Rename Lossy to View
- [x] Consider whether we should use collapsed deltas
- [ ] Rename/consolidate, lossless view() and compose() --> composeView()
- [ ] Rename Lossless to HyperView
- [ ] Rename Lossy to View
- [ ] Consider whether we should use collapsed deltas
- [ ] Improve ergonomics of declaring multiple entity properties in one delta
- [x] Use dotenv so we can more easily manage the local dev test environment
- [x] Create test helpers to reduce boilerplate
- [ ] Create test helpers to reduce boilerplate

View File

@@ -1,5 +1,5 @@
# Test structure
- before test, initialize node and hyperview
- before test, initialize node and lossless view
- when test begins, create and ingest a series of deltas
- instantiate a resolver, in this case using custom resolver plugins
- call the resolver's initializer with the view
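The resolver-plugin step in those tests can be sketched with toy types (all names here are invented for illustration; the real harness is `testResolverWithPlugins`, shown in the next file):

```typescript
// Toy model of plugin-per-property resolution, not the project's real types.
type Delta = { property: string; value: number; timestamp: number };
type ResolverPlugin = (deltas: Delta[]) => unknown;

// Two example plugins: keep the newest value, or sum every value.
const lastWriteWins: ResolverPlugin = deltas =>
  deltas.reduce((a, b) => (b.timestamp >= a.timestamp ? b : a)).value;

const sum: ResolverPlugin = deltas =>
  deltas.reduce((total, d) => total + d.value, 0);

function resolveWithPlugins(
  deltas: Delta[],
  plugins: Record<string, ResolverPlugin>,
): Record<string, unknown> {
  // Group ingested deltas by property, then let each plugin reduce its group.
  const grouped = new Map<string, Delta[]>();
  for (const d of deltas) {
    const bucket = grouped.get(d.property) ?? [];
    bucket.push(d);
    grouped.set(d.property, bucket);
  }
  const resolved: Record<string, unknown> = {};
  for (const [property, bucket] of grouped) {
    const plugin = plugins[property];
    if (plugin !== undefined) resolved[property] = plugin(bucket);
  }
  return resolved;
}
```

Mapping plugins per property is what lets one test mix, say, a majority-vote plugin for one field with last-write-wins for another.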

View File

@@ -1,5 +1,5 @@
import { RhizomeNode } from '@src';
import { Hyperview } from '@src/views/hyperview';
import { Lossless } from '@src/views/lossless';
import { Delta } from '@src/core/delta';
import { createDelta } from '@src/core/delta-builder';
import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
@@ -29,12 +29,12 @@ export async function testResolverWithPlugins<T extends TestPluginMap>(
// Setup test environment
const node = new RhizomeNode();
const hyperview = new Hyperview(node);
const view = new CustomResolver(hyperview, plugins);
const lossless = new Lossless(node);
const view = new CustomResolver(lossless, plugins);
// Ingest all deltas through the hyperview instance
// Ingest all deltas through the lossless instance
for (const delta of deltas) {
hyperview.ingestDelta(delta);
lossless.ingestDelta(delta);
}
// Get the resolved view

View File

@@ -1,4 +1,4 @@
import { HyperviewOne } from '@src/views/hyperview';
import { LosslessViewOne } from '@src/views/lossless';
import {
SchemaBuilder,
PrimitiveSchemas,
@@ -153,12 +153,12 @@ describe('Schema System', () => {
expect(schemaRegistry.hasCircularDependencies()).toBe(true);
});
test('should validate hyperviews against schemas', () => {
test('should validate lossless views against schemas', () => {
const userSchema = CommonSchemas.User();
schemaRegistry.register(userSchema);
// Create a valid hyperview
const validView: HyperviewOne = {
// Create a valid lossless view
const validView: LosslessViewOne = {
id: 'user123',
propertyDeltas: {
name: [
@@ -179,7 +179,7 @@ describe('Schema System', () => {
expect(result.errors).toHaveLength(0);
// Test invalid view (missing required property)
const invalidView: HyperviewOne = {
const invalidView: LosslessViewOne = {
id: 'user456',
propertyDeltas: {
age: [
@@ -212,7 +212,7 @@ describe('Schema System', () => {
schemaRegistry.register(schema);
// Valid types
const validView: HyperviewOne = {
const validView: LosslessViewOne = {
id: 'test1',
propertyDeltas: {
stringProp: [
@@ -240,7 +240,7 @@ describe('Schema System', () => {
expect(validResult.valid).toBe(true);
// Invalid types
const invalidView: HyperviewOne = {
const invalidView: LosslessViewOne = {
id: 'test2',
propertyDeltas: {
stringProp: [
@@ -331,7 +331,7 @@ describe('Schema System', () => {
.addPointer('users', 'user3', 'email')
.addPointer('email', 'invalid@test.com')
.buildV1();
node.hyperview.ingestDelta(invalidDelta);
node.lossless.ingestDelta(invalidDelta);
const stats = collection.getValidationStats();
expect(stats.totalEntities).toBe(3);
@@ -355,11 +355,11 @@ describe('Schema System', () => {
const invalidDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty('user3', 'age', 'not-a-number', 'users')
.buildV1();
node.hyperview.ingestDelta(invalidDelta);
node.lossless.ingestDelta(invalidDelta);
debug(`Manually ingested invalid delta: ${JSON.stringify(invalidDelta)}`)
debug(`Hyperview: ${JSON.stringify(node.hyperview.compose(), null, 2)}`)
debug(`Lossless view: ${JSON.stringify(node.lossless.compose(), null, 2)}`)
const validIds = collection.getValidEntities();
expect(validIds).toContain('user1');
@@ -371,7 +371,7 @@ describe('Schema System', () => {
expect(invalidEntities[0].entityId).toBe('user3');
});
test('should apply schema to hyperviews', async () => {
test('should apply schema to lossless views', async () => {
const userSchema = CommonSchemas.User();
const collection = new TypedCollectionImpl<{
name: string;

View File

@@ -1,7 +1,7 @@
import { createDelta } from '@src/core/delta-builder';
import {
RhizomeNode,
Hyperview,
Lossless,
SumResolver,
CustomResolver,
LastWriteWinsPlugin,
@@ -13,28 +13,28 @@ const debug = Debug('rz:test:performance');
describe('Concurrent Write Scenarios', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('Simultaneous Writes with Same Timestamp', () => {
test('should handle simultaneous writes using last-write-wins resolver', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
const timestamp = 1000;
// Simulate two writers updating the same property at the exact same time
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withId('delta-a')
.withTimestamp(timestamp)
.setProperty('entity1', 'score', 100, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer2', 'host2')
lossless.ingestDelta(createDelta('writer2', 'host2')
.withId('delta-b')
.withTimestamp(timestamp) // Same timestamp
.setProperty('entity1', 'score', 200, 'collection')
@@ -52,16 +52,16 @@ describe('Concurrent Write Scenarios', () => {
test('should handle simultaneous writes using timestamp resolver with tie-breaking', () => {
const timestamp = 1000;
const resolver = new TimestampResolver(hyperview, 'creator-id');
const resolver = new TimestampResolver(lossless, 'creator-id');
hyperview.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
lossless.ingestDelta(createDelta('writer_z', 'host1') // Lexicographically later
.withId('delta-a')
.withTimestamp(timestamp)
.setProperty('entity1', 'score', 100, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
lossless.ingestDelta(createDelta('writer_a', 'host2') // Lexicographically earlier
.withId('delta-b')
.withTimestamp(timestamp) // Same timestamp
.setProperty('entity1', 'score', 200, 'collection')
@@ -76,23 +76,23 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle multiple writers with aggregation resolver', () => {
const resolver = new SumResolver(hyperview, ['points']);
const resolver = new SumResolver(lossless, ['points']);
// Multiple writers add values simultaneously
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'points', 10, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer2', 'host2')
lossless.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(1000) // Same timestamp
.setProperty('entity1', 'points', 20, 'collection')
.buildV1()
);
// Third writer adds another value
hyperview.ingestDelta(createDelta('writer3', 'host3')
lossless.ingestDelta(createDelta('writer3', 'host3')
.withTimestamp(1000) // Same timestamp
.setProperty('entity1', 'points', 30, 'collection')
.buildV1()
@@ -108,10 +108,10 @@ describe('Concurrent Write Scenarios', () => {
describe('Out-of-Order Write Arrival', () => {
test('should handle writes arriving out of chronological order', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
// Newer delta arrives first
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 'newer')
@@ -119,7 +119,7 @@ describe('Concurrent Write Scenarios', () => {
);
// Older delta arrives later
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 'older')
@@ -134,24 +134,24 @@ describe('Concurrent Write Scenarios', () => {
});
test('should maintain correct aggregation despite out-of-order arrival', () => {
const resolver = new SumResolver(hyperview, ['score']);
const resolver = new SumResolver(lossless, ['score']);
// Add deltas in reverse chronological order
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(3000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 30)
.buildV1()
);
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 10)
.buildV1()
);
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 20)
@@ -168,7 +168,7 @@ describe('Concurrent Write Scenarios', () => {
describe('High-Frequency Concurrent Updates', () => {
test('should handle rapid concurrent updates to the same entity', () => {
const resolver = new SumResolver(hyperview, ['counter']);
const resolver = new SumResolver(lossless, ['counter']);
const baseTimestamp = 1000;
const numWriters = 10;
@@ -177,7 +177,7 @@ describe('Concurrent Write Scenarios', () => {
// Simulate multiple writers making rapid updates
for (let writer = 0; writer < numWriters; writer++) {
for (let write = 0; write < writesPerWriter; write++) {
hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
.withTimestamp(baseTimestamp + write)
.addPointer('collection', 'entity1', 'counter')
.addPointer('counter', 1)
@@ -194,7 +194,7 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle concurrent updates to multiple properties', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
name: new LastWriteWinsPlugin(),
score: new LastWriteWinsPlugin()
});
@@ -202,14 +202,14 @@ describe('Concurrent Write Scenarios', () => {
const timestamp = 1000;
// Writer 1 updates name and score
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'name')
.addPointer('name', 'alice')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp + 1)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 100)
@@ -217,14 +217,14 @@ describe('Concurrent Write Scenarios', () => {
);
// Writer 2 updates name and score concurrently
hyperview.ingestDelta(createDelta('writer2', 'host2')
lossless.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp + 2)
.addPointer('collection', 'entity1', 'name')
.addPointer('name', 'bob')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer2', 'host2')
lossless.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp + 3)
.addPointer('collection', 'entity1', 'score')
.addPointer('score', 200)
@@ -241,13 +241,13 @@ describe('Concurrent Write Scenarios', () => {
describe('Cross-Entity Concurrent Writes', () => {
test('should handle concurrent writes to different entities', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
const timestamp = 1000;
// Multiple writers updating different entities simultaneously
for (let i = 0; i < 5; i++) {
hyperview.ingestDelta(createDelta(`writer${i}`, `host${i}`)
lossless.ingestDelta(createDelta(`writer${i}`, `host${i}`)
.withTimestamp(timestamp)
.addPointer('collection', `entity${i}`, 'value')
.addPointer('value', (i + 1) * 10)
@@ -266,28 +266,28 @@ describe('Concurrent Write Scenarios', () => {
});
test('should handle mixed entity and property conflicts', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
votes: new MajorityVotePlugin(),
status: new LastWriteWinsPlugin()
});
const timestamp = 1000;
// Entity1: Multiple writers competing for same property
hyperview.ingestDelta(createDelta('writer1', 'host1')
lossless.ingestDelta(createDelta('writer1', 'host1')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_a')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer2', 'host2')
lossless.ingestDelta(createDelta('writer2', 'host2')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_a')
.buildV1()
);
hyperview.ingestDelta(createDelta('writer3', 'host3')
lossless.ingestDelta(createDelta('writer3', 'host3')
.withTimestamp(timestamp)
.addPointer('collection', 'entity1', 'votes')
.addPointer('votes', 'option_b')
@@ -295,7 +295,7 @@ describe('Concurrent Write Scenarios', () => {
);
// Entity2: Single writer, no conflict
hyperview.ingestDelta(createDelta('writer4', 'host4')
lossless.ingestDelta(createDelta('writer4', 'host4')
.withTimestamp(timestamp)
.addPointer('collection', 'entity2', 'status')
.addPointer('status', 'active')
@@ -312,7 +312,7 @@ describe('Concurrent Write Scenarios', () => {
describe('Stress Testing', () => {
test('should handle large number of concurrent writes efficiently', () => {
const resolver = new SumResolver(hyperview, ['score']);
const resolver = new SumResolver(lossless, ['score']);
const numEntities = 100;
const numWritersPerEntity = 10;
@@ -321,7 +321,7 @@ describe('Concurrent Write Scenarios', () => {
// Generate a large number of concurrent writes
for (let entity = 0; entity < numEntities; entity++) {
for (let writer = 0; writer < numWritersPerEntity; writer++) {
hyperview.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
lossless.ingestDelta(createDelta(`writer${writer}`, `host${writer}`)
.withTimestamp(baseTimestamp + Math.floor(Math.random() * 1000))
.addPointer('collection', `entity${entity}`, 'score')
.addPointer('score', Math.floor(Math.random() * 100))
@@ -344,14 +344,14 @@ describe('Concurrent Write Scenarios', () => {
});
test('should maintain consistency under rapid updates and resolution calls', () => {
const resolver = new SumResolver(hyperview, ['counter']);
const resolver = new SumResolver(lossless, ['counter']);
const entityId = 'stress-test-entity';
let updateCount = 0;
// Add initial deltas
for (let i = 0; i < 50; i++) {
hyperview.ingestDelta(createDelta(
lossless.ingestDelta(createDelta(
`writer${i % 5}`,
`host${i % 3}`
)
@@ -370,7 +370,7 @@ describe('Concurrent Write Scenarios', () => {
// Add more deltas and verify consistency
for (let i = 0; i < 25; i++) {
hyperview.ingestDelta(createDelta('late-writer', 'late-host')
lossless.ingestDelta(createDelta('late-writer', 'late-host')
.withTimestamp(2000 + i)
.addPointer('collection', entityId, 'counter')
.addPointer('counter', 2)

View File

@@ -83,7 +83,7 @@ describe('Nested Object Resolution Performance', () => {
const friendshipDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'friends', friendId, 'users')
.buildV1();
node.hyperview.ingestDelta(friendshipDelta);
node.lossless.ingestDelta(friendshipDelta);
}
}
@@ -96,7 +96,7 @@ describe('Nested Object Resolution Performance', () => {
const followDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'followers', followerId, 'users')
.buildV1();
node.hyperview.ingestDelta(followDelta);
node.lossless.ingestDelta(followDelta);
}
}
@@ -107,7 +107,7 @@ describe('Nested Object Resolution Performance', () => {
const mentorshipDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(userId, 'mentor', mentorId, 'users')
.buildV1();
node.hyperview.ingestDelta(mentorshipDelta);
node.lossless.ingestDelta(mentorshipDelta);
}
}
@@ -116,7 +116,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution performance for a user with many connections
const testUserId = userIds[50]; // Pick a user in the middle
const userViews = node.hyperview.compose([testUserId]);
const userViews = node.lossless.compose([testUserId]);
const userView = userViews[testUserId];
const startResolution = performance.now();
@@ -124,7 +124,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'network-user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -197,7 +197,7 @@ describe('Nested Object Resolution Performance', () => {
const linkDelta = createDelta(node.config.creator, node.config.peerId)
.setProperty(currentId, 'next', nextId, 'users')
.buildV1();
node.hyperview.ingestDelta(linkDelta);
node.lossless.ingestDelta(linkDelta);
}
const setupTime = performance.now() - startSetup;
@@ -205,7 +205,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution from the start of the chain
const firstUserId = userIds[0];
const userViews = node.hyperview.compose([firstUserId]);
const userViews = node.lossless.compose([firstUserId]);
const userView = userViews[firstUserId];
const startResolution = performance.now();
@@ -213,7 +213,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'chain-user',
node.hyperview,
node.lossless,
{ maxDepth: 5 } // Should resolve 5 levels deep
);
@@ -292,7 +292,7 @@ describe('Nested Object Resolution Performance', () => {
.addPointer('users', userId, 'connections')
.addPointer('connections', connectedId)
.buildV1();
node.hyperview.ingestDelta(connectionDelta);
node.lossless.ingestDelta(connectionDelta);
}
}
@@ -301,7 +301,7 @@ describe('Nested Object Resolution Performance', () => {
// Test resolution performance with circular references
const testUserId = userIds[0];
const userViews = node.hyperview.compose([testUserId]);
const userViews = node.lossless.compose([testUserId]);
const userView = userViews[testUserId];
const startResolution = performance.now();
@@ -309,7 +309,7 @@ describe('Nested Object Resolution Performance', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
userView,
'circular-user',
node.hyperview,
node.lossless,
{ maxDepth: 3 }
);

View File

@@ -1,13 +1,13 @@
/**
* Tests for hyperview compose() and decompose() bidirectional conversion
* Ensures that deltas can be composed into hyperviews and decomposed back
* Tests for lossless view compose() and decompose() bidirectional conversion
* Ensures that deltas can be composed into lossless views and decomposed back
* to the original deltas with all pointer relationships preserved.
*/
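The round-trip invariant described in that comment can be modeled with toy types (the `propertyDeltas` shape mirrors the views used elsewhere in these tests, but `compose`/`decompose` here are simplified stand-ins, not the real implementation):

```typescript
// Toy round-trip model: compose groups deltas by property; decompose flattens
// the groups back out. The delta set should survive the round trip intact.
type Delta = { id: string; property: string; value: unknown };
type EntityView = { id: string; propertyDeltas: Record<string, Delta[]> };

function compose(entityId: string, deltas: Delta[]): EntityView {
  const propertyDeltas: Record<string, Delta[]> = {};
  for (const d of deltas) {
    const bucket = propertyDeltas[d.property] ?? [];
    bucket.push(d);
    propertyDeltas[d.property] = bucket;
  }
  return { id: entityId, propertyDeltas };
}

function decompose(view: EntityView): Delta[] {
  return Object.values(view.propertyDeltas).flat();
}
```

The tests that follow check this same property on real deltas, including pointer relationships that the toy model omits.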
import { RhizomeNode } from '@src/node';
import { createDelta } from '@src/core/delta-builder';
describe('Hyperview View Compose/Decompose', () => {
describe('Lossless View Compose/Decompose', () => {
let node: RhizomeNode;
beforeEach(() => {
@@ -29,10 +29,10 @@ describe('Hyperview View Compose/Decompose', () => {
];
// Ingest the deltas
nameDeltas.forEach(delta => node.hyperview.ingestDelta(delta));
nameDeltas.forEach(delta => node.lossless.ingestDelta(delta));
// Compose hyperview
const composed = node.hyperview.compose(['alice']);
// Compose lossless view
const composed = node.lossless.compose(['alice']);
const aliceView = composed['alice'];
expect(aliceView).toBeDefined();
@@ -41,7 +41,7 @@ describe('Hyperview View Compose/Decompose', () => {
expect(aliceView.propertyDeltas.email).toHaveLength(1);
// Decompose back to deltas
const decomposed = node.hyperview.decompose(aliceView);
const decomposed = node.lossless.decompose(aliceView);
expect(decomposed).toHaveLength(2);
@@ -73,12 +73,12 @@ describe('Hyperview View Compose/Decompose', () => {
.addPointer('intensity', 8)
.buildV1();
node.hyperview.ingestDelta(relationshipDelta);
node.lossless.ingestDelta(relationshipDelta);
// Compose and decompose
const composed = node.hyperview.compose(['alice']);
const composed = node.lossless.compose(['alice']);
const aliceView = composed['alice'];
const decomposed = node.hyperview.decompose(aliceView);
const decomposed = node.lossless.decompose(aliceView);
expect(decomposed).toHaveLength(1);
const reconstituted = decomposed[0];
@@ -119,16 +119,16 @@ describe('Hyperview View Compose/Decompose', () => {
.addPointer('friend', 'bob', 'friends')
.buildV1();
[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.hyperview.ingestDelta(d));
[aliceDelta, bobDelta, friendshipDelta].forEach(d => node.lossless.ingestDelta(d));
// Compose Alice's view
const composed = node.hyperview.compose(['alice']);
const composed = node.lossless.compose(['alice']);
const aliceView = composed['alice'];
expect(aliceView.propertyDeltas.friends).toHaveLength(1);
// Decompose and verify the friendship delta is correctly reconstructed
const decomposed = node.hyperview.decompose(aliceView);
const decomposed = node.lossless.decompose(aliceView);
const friendshipReconstituted = decomposed.find(d =>
d.pointers.some(p => p.localContext === 'friend')
);
@@ -152,10 +152,10 @@ describe('Hyperview View Compose/Decompose', () => {
.addPointer('name', 'Alice')
.buildV1();
node.hyperview.ingestDelta(originalDelta);
node.lossless.ingestDelta(originalDelta);
const composed = node.hyperview.compose(['alice']);
const decomposed = node.hyperview.decompose(composed['alice']);
const composed = node.lossless.compose(['alice']);
const decomposed = node.lossless.decompose(composed['alice']);
expect(decomposed).toHaveLength(1);
const reconstituted = decomposed[0];
@@ -184,15 +184,15 @@ describe('Hyperview View Compose/Decompose', () => {
.buildV1()
];
nameDeltas.forEach(d => node.hyperview.ingestDelta(d));
nameDeltas.forEach(d => node.lossless.ingestDelta(d));
const composed = node.hyperview.compose(['alice']);
const composed = node.lossless.compose(['alice']);
const aliceView = composed['alice'];
// Should have 3 deltas for the name property
expect(aliceView.propertyDeltas.name).toHaveLength(3);
const decomposed = node.hyperview.decompose(aliceView);
const decomposed = node.lossless.decompose(aliceView);
// Should decompose back to 3 separate deltas
expect(decomposed).toHaveLength(3);

View File

@@ -1,6 +1,6 @@
import { createDelta } from '@src/core/delta-builder';
import { DeltaV1, DeltaV2 } from '@src/core/delta';
import { Hyperview } from '@src/views/hyperview';
import { Lossless } from '@src/views/lossless';
import { RhizomeNode } from '@src/node';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
@@ -45,10 +45,10 @@ describe('DeltaBuilder', () => {
});
// Verify that the entity property resolves correctly
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve();
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve();
expect(result).toBeDefined();
expect(result!['entity-1'].properties.name).toBe('Test Entity');
});
@@ -70,10 +70,10 @@ describe('DeltaBuilder', () => {
});
// Verify that the entity property resolves correctly
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve();
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve();
expect(result).toBeDefined();
expect(result!['entity-1'].properties.name).toBe('Test Entity');
});
@@ -170,10 +170,10 @@ describe('DeltaBuilder', () => {
expect(delta.pointers).toHaveProperty('target', 'user-2');
expect(delta.pointers).toHaveProperty('type', 'follows');
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve([relId]);
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve([relId]);
expect(result).toBeDefined();
expect(result![relId]).toMatchObject({
properties: {
@@ -200,10 +200,10 @@ describe('DeltaBuilder', () => {
expect(delta.pointers).toHaveProperty('type', 'follows');
expect(delta.pointers).toHaveProperty('version', 1);
const hyperview = new Hyperview(node);
const view = new TimestampResolver(hyperview);
hyperview.ingestDelta(delta);
const result = view.resolve([relId]);
const lossless = new Lossless(node);
const lossy = new TimestampResolver(lossless);
lossless.ingestDelta(delta);
const result = lossy.resolve([relId]);
expect(result).toBeDefined();
expect(result![relId]).toMatchObject({
properties: {

View File

@@ -2,17 +2,17 @@ import Debug from 'debug';
import { createDelta } from '@src/core/delta-builder';
import { NegationHelper } from '@src/features';
import { RhizomeNode } from '@src/node';
import { Hyperview } from '@src/views';
import { Lossless } from '@src/views';
const debug = Debug('rz:negation:test');
describe('Negation System', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('Negation Helper', () => {
@@ -179,8 +179,8 @@ describe('Negation System', () => {
});
});
describe('Hyperview View Integration', () => {
test('should filter negated deltas in hyperviews', () => {
describe('Lossless View Integration', () => {
test('should filter negated deltas in lossless views', () => {
// Create original delta
const originalDelta = createDelta('user1', 'host1')
.setProperty('user123', 'name', 'Alice')
@@ -198,12 +198,12 @@ describe('Negation System', () => {
.buildV1();
// Ingest all deltas
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
hyperview.ingestDelta(nonNegatedDelta);
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
lossless.ingestDelta(nonNegatedDelta);
// Get view - should only show non-negated delta
const view = hyperview.compose(['user123']);
const view = lossless.compose(['user123']);
expect(view.user123).toBeDefined();
@@ -220,11 +220,11 @@ describe('Negation System', () => {
const negation1 = createDelta('mod1', 'host1').negate(originalDelta.id).buildV1();
const negation2 = createDelta('mod2', 'host1').negate(originalDelta.id).buildV1();
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negation1);
hyperview.ingestDelta(negation2);
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negation1);
lossless.ingestDelta(negation2);
const view = hyperview.compose(['post1']);
const view = lossless.compose(['post1']);
// Original delta should be negated (not visible)
expect(view.post1).toBeUndefined();
@@ -241,11 +241,11 @@ describe('Negation System', () => {
const negation1 = createDelta('mod1', 'host1').negate(delta1.id).buildV1();
hyperview.ingestDelta(delta1);
hyperview.ingestDelta(delta2);
hyperview.ingestDelta(negation1);
lossless.ingestDelta(delta1);
lossless.ingestDelta(delta2);
lossless.ingestDelta(negation1);
const stats = hyperview.getNegationStats('article1');
const stats = lossless.getNegationStats('article1');
expect(stats.totalDeltas).toBe(3);
expect(stats.negationDeltas).toBe(1);
@@ -262,10 +262,10 @@ describe('Negation System', () => {
const negationDelta = createDelta('admin', 'host1').negate(originalDelta.id).buildV1();
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
const negations = hyperview.getNegationDeltas('task1');
const negations = lossless.getNegationDeltas('task1');
expect(negations).toHaveLength(1);
expect(negations[0].id).toBe(negationDelta.id);
expect(negations[0].creator).toBe('admin');
@@ -275,7 +275,7 @@ describe('Negation System', () => {
const transactionId = 'tx-negation';
// Create transaction declaration
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
@@ -294,11 +294,11 @@ describe('Negation System', () => {
targetContext: 'deltas'
});
hyperview.ingestDelta(originalDelta);
hyperview.ingestDelta(negationDelta);
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negationDelta);
// Transaction should complete, but original delta should be negated
const view = hyperview.compose(['post1']);
const view = lossless.compose(['post1']);
expect(view.post1).toBeUndefined(); // No visible deltas
});
@@ -321,11 +321,11 @@ describe('Negation System', () => {
.setProperty('post1', 'content', 'Edited post')
.buildV1();
hyperview.ingestDelta(postDelta);
hyperview.ingestDelta(negationDelta);
hyperview.ingestDelta(editDelta);
lossless.ingestDelta(postDelta);
lossless.ingestDelta(negationDelta);
lossless.ingestDelta(editDelta);
const view = hyperview.compose(['post1']);
const view = lossless.compose(['post1']);
// Should show edited content (edit happened after negation)
expect(view.post1).toBeDefined();
@@ -341,10 +341,10 @@ describe('Negation System', () => {
test('should handle negation of non-existent deltas', () => {
const negationDelta = createDelta('moderator', 'host1').negate('non-existent-delta-id').buildV1();
hyperview.ingestDelta(negationDelta);
lossless.ingestDelta(negationDelta);
// Should not crash and stats should reflect the orphaned negation
const stats = hyperview.getNegationStats('entity1');
const stats = lossless.getNegationStats('entity1');
expect(stats.negationDeltas).toBe(0); // No negations for this entity
});
@@ -357,16 +357,16 @@ describe('Negation System', () => {
const negationDelta = createDelta('admin', 'host1').negate(selfRefDelta.id).buildV1();
hyperview.ingestDelta(selfRefDelta);
hyperview.ingestDelta(negationDelta);
lossless.ingestDelta(selfRefDelta);
lossless.ingestDelta(negationDelta);
const view = hyperview.compose(['node1']);
const view = lossless.compose(['node1']);
expect(view.node1).toBeUndefined(); // Should be negated
});
test('should handle multiple direct negations of the same delta', () => {
const testNode = new RhizomeNode();
const testHyperview = new Hyperview(testNode);
const testLossless = new Lossless(testNode);
// Create the original delta
const originalDelta = createDelta('user1', 'host1')
@@ -378,18 +378,18 @@ describe('Negation System', () => {
const negation2 = createDelta('user3', 'host1').negate(originalDelta.id).buildV1();
// Process all deltas
testHyperview.ingestDelta(originalDelta);
testHyperview.ingestDelta(negation1);
testHyperview.ingestDelta(negation2);
testLossless.ingestDelta(originalDelta);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testHyperview.compose(['entity2']);
const view = testLossless.compose(['entity2']);
// The original delta should be negated (not in view) because it has two direct negations
expect(view.entity2).toBeUndefined();
// Verify the stats
const stats = testHyperview.getNegationStats('entity2');
const stats = testLossless.getNegationStats('entity2');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(1);
expect(stats.effectiveDeltas).toBe(0);
@@ -397,7 +397,7 @@ describe('Negation System', () => {
test('should handle complex negation chains', () => {
const testNode = new RhizomeNode();
const testHyperview = new Hyperview(testNode);
const testLossless = new Lossless(testNode);
// Create the original delta
const deltaA = createDelta('user1', 'host1')
@@ -415,13 +415,13 @@ describe('Negation System', () => {
debug('Delta D (negates C): %s', deltaD.id);
// Process all deltas in order
testHyperview.ingestDelta(deltaA);
testHyperview.ingestDelta(deltaB);
testHyperview.ingestDelta(deltaC);
testHyperview.ingestDelta(deltaD);
testLossless.ingestDelta(deltaA);
testLossless.ingestDelta(deltaB);
testLossless.ingestDelta(deltaC);
testLossless.ingestDelta(deltaD);
// Get the view after processing all deltas
const view = testHyperview.compose(['entity3']);
const view = testLossless.compose(['entity3']);
// The original delta should be negated because:
// - B negates A
@@ -433,7 +433,7 @@ describe('Negation System', () => {
const allDeltas = [deltaA, deltaB, deltaC, deltaD];
// Get the stats
const stats = testHyperview.getNegationStats('entity3');
const stats = testLossless.getNegationStats('entity3');
const isANegated = NegationHelper.isDeltaNegated(deltaA.id, allDeltas);
const isBNegated = NegationHelper.isDeltaNegated(deltaB.id, allDeltas);
const isCNegated = NegationHelper.isDeltaNegated(deltaC.id, allDeltas);
@@ -470,7 +470,7 @@ describe('Negation System', () => {
test('should handle multiple independent negations', () => {
const testNode = new RhizomeNode();
const testHyperview = new Hyperview(testNode);
const testLossless = new Lossless(testNode);
// Create two independent deltas
const delta1 = createDelta('user1', 'host1')
@@ -486,19 +486,19 @@ describe('Negation System', () => {
const negation2 = createDelta('user4', 'host1').negate(delta2.id).buildV1();
// Process all deltas
testHyperview.ingestDelta(delta1);
testHyperview.ingestDelta(delta2);
testHyperview.ingestDelta(negation1);
testHyperview.ingestDelta(negation2);
testLossless.ingestDelta(delta1);
testLossless.ingestDelta(delta2);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testHyperview.compose(['entity4']);
const view = testLossless.compose(['entity4']);
// Both deltas should be negated
expect(view.entity4).toBeUndefined();
// Verify the stats
const stats = testHyperview.getNegationStats('entity4');
const stats = testLossless.getNegationStats('entity4');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(2);
expect(stats.effectiveDeltas).toBe(0);


@@ -1,15 +1,15 @@
import { createDelta } from '@src/core/delta-builder';
import { Hyperview } from '@src/views';
import { Lossless } from '@src/views';
import { RhizomeNode } from '@src/node';
import { DeltaFilter } from '@src/core';
describe('Transactions', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('Transaction-based filtering', () => {
@@ -34,12 +34,12 @@ describe('Transactions', () => {
.buildV1();
// Ingest transaction declaration and first two deltas
hyperview.ingestDelta(txDeclaration);
hyperview.ingestDelta(delta1);
hyperview.ingestDelta(delta2);
lossless.ingestDelta(txDeclaration);
lossless.ingestDelta(delta1);
lossless.ingestDelta(delta2);
// View should be empty because transaction is incomplete (2/3 deltas)
const view = hyperview.compose(['user123']);
const view = lossless.compose(['user123']);
expect(view.user123).toBeUndefined();
// Add the third delta to complete the transaction
@@ -48,10 +48,10 @@ describe('Transactions', () => {
.setProperty('user123', 'email', 'alice@example.com')
.buildV1();
hyperview.ingestDelta(delta3);
lossless.ingestDelta(delta3);
// Now the view should include all deltas from the completed transaction
const completeView = hyperview.compose(['user123']);
const completeView = lossless.compose(['user123']);
expect(completeView.user123).toBeDefined();
expect(completeView.user123.propertyDeltas.name).toHaveLength(1);
expect(completeView.user123.propertyDeltas.age).toHaveLength(1);
@@ -63,57 +63,57 @@ describe('Transactions', () => {
const tx2 = 'tx-002';
// Declare two transactions
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(tx1, 2)
.buildV1()
);
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(tx2, 2)
.buildV1()
);
// Add deltas for both transactions
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(tx1)
.setProperty('order1', 'status', 'pending')
.buildV1()
);
hyperview.ingestDelta(createDelta('user2', 'host2')
lossless.ingestDelta(createDelta('user2', 'host2')
.inTransaction(tx2)
.setProperty('order2', 'status', 'shipped')
.buildV1()
);
// Neither transaction is complete
let view = hyperview.compose(['order1', 'order2']);
let view = lossless.compose(['order1', 'order2']);
expect(view.order1).toBeUndefined();
expect(view.order2).toBeUndefined();
// Complete tx1
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(tx1)
.setProperty('order1', 'total', 100)
.buildV1()
);
// tx1 is complete, tx2 is not
view = hyperview.compose(['order1', 'order2']);
view = lossless.compose(['order1', 'order2']);
expect(view.order1).toBeDefined();
expect(view.order1.propertyDeltas.status).toHaveLength(1);
expect(view.order1.propertyDeltas.total).toHaveLength(1);
expect(view.order2).toBeUndefined();
// Complete tx2
hyperview.ingestDelta(createDelta('user2', 'host2')
lossless.ingestDelta(createDelta('user2', 'host2')
.inTransaction(tx2)
.setProperty('order2', 'tracking', 'TRACK123')
.buildV1()
);
// Both transactions complete
view = hyperview.compose(['order1', 'order2']);
view = lossless.compose(['order1', 'order2']);
expect(view.order1).toBeDefined();
expect(view.order2).toBeDefined();
expect(view.order2.propertyDeltas.status).toHaveLength(1);
@@ -124,19 +124,19 @@ describe('Transactions', () => {
const transactionId = 'tx-filter-test';
// Create transaction with 2 deltas
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add both deltas
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('doc1', 'type', 'report')
.buildV1()
);
hyperview.ingestDelta(createDelta('user2', 'host2')
lossless.ingestDelta(createDelta('user2', 'host2')
.inTransaction(transactionId)
.setProperty('doc1', 'author', 'Bob')
.buildV1()
@@ -147,7 +147,7 @@ describe('Transactions', () => {
// With incomplete transaction, nothing should show
// But once complete, the filter should still apply
const view = hyperview.compose(['doc1'], userFilter);
const view = lossless.compose(['doc1'], userFilter);
// Even though transaction is complete, only delta from user1 should appear
expect(view.doc1).toBeDefined();
@@ -159,13 +159,13 @@ describe('Transactions', () => {
const transactionId = 'tx-multi-entity';
// Transaction that updates multiple entities atomically
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 3)
.buildV1()
);
// Transfer money from account1 to account2
hyperview.ingestDelta(createDelta('bank', 'host1')
lossless.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('balance', 'account1', 'balance')
.addPointer('value', 900)
@@ -173,7 +173,7 @@ describe('Transactions', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('bank', 'host1')
lossless.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('balance', 'account2', 'balance')
.addPointer('value', 1100)
@@ -182,12 +182,12 @@ describe('Transactions', () => {
);
// Transaction incomplete - no entities should show updates
let view = hyperview.compose(['account1', 'account2']);
let view = lossless.compose(['account1', 'account2']);
expect(view.account1).toBeUndefined();
expect(view.account2).toBeUndefined();
// Complete transaction with audit log
hyperview.ingestDelta(createDelta('bank', 'host1')
lossless.ingestDelta(createDelta('bank', 'host1')
.inTransaction(transactionId)
.addPointer('transfer', 'transfer123', 'details')
.addPointer('from', 'account1')
@@ -197,7 +197,7 @@ describe('Transactions', () => {
);
// All entities should now be visible
view = hyperview.compose(['account1', 'account2', 'transfer123']);
view = lossless.compose(['account1', 'account2', 'transfer123']);
expect(view.account1).toBeDefined();
expect(view.account1.propertyDeltas.balance).toHaveLength(1);
expect(view.account2).toBeDefined();
@@ -211,12 +211,12 @@ describe('Transactions', () => {
const updateEvents: Array<{ entityId: string, deltaIds: string[] }> = [];
// Listen for update events
hyperview.eventStream.on('updated', (entityId: string, deltaIds: string[]) => {
lossless.eventStream.on('updated', (entityId: string, deltaIds: string[]) => {
updateEvents.push({ entityId, deltaIds });
});
// Create transaction
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
@@ -226,7 +226,7 @@ describe('Transactions', () => {
.inTransaction(transactionId)
.setProperty('entity1', 'field1', 'value1')
.buildV1();
hyperview.ingestDelta(delta1);
lossless.ingestDelta(delta1);
// No events should be emitted yet
expect(updateEvents).toHaveLength(0);
@@ -236,7 +236,7 @@ describe('Transactions', () => {
.inTransaction(transactionId)
.setProperty('entity1', 'field2', 'value2')
.buildV1();
hyperview.ingestDelta(delta2);
lossless.ingestDelta(delta2);
// Wait for async event processing
await new Promise(resolve => setTimeout(resolve, 10));
@@ -256,20 +256,20 @@ describe('Transactions', () => {
const transactionId = 'tx-wait';
// Create transaction
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add first delta
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('job1', 'status', 'processing')
.buildV1()
);
// Start waiting for transaction
const waitPromise = hyperview.transactions.waitFor(transactionId);
const waitPromise = lossless.transactions.waitFor(transactionId);
let isResolved = false;
waitPromise.then(() => { isResolved = true; });
@@ -278,7 +278,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(false);
// Complete transaction
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('job1', 'status', 'completed')
.buildV1()
@@ -289,7 +289,7 @@ describe('Transactions', () => {
expect(isResolved).toBe(true);
// View should show completed transaction
const view = hyperview.compose(['job1']);
const view = lossless.compose(['job1']);
expect(view.job1).toBeDefined();
expect(view.job1.propertyDeltas.status).toHaveLength(2);
});
@@ -302,14 +302,14 @@ describe('Transactions', () => {
.buildV1();
const updateEvents: string[] = [];
hyperview.eventStream.on('updated', (entityId: string) => {
lossless.eventStream.on('updated', (entityId: string) => {
updateEvents.push(entityId);
});
hyperview.ingestDelta(regularDelta);
lossless.ingestDelta(regularDelta);
// Should immediately appear in view
const view = hyperview.compose(['user456']);
const view = lossless.compose(['user456']);
expect(view.user456).toBeDefined();
expect(view.user456.propertyDeltas.name).toHaveLength(1);
@@ -323,29 +323,29 @@ describe('Transactions', () => {
const transactionId = 'tx-resize';
// Initially declare transaction with size 2
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 2)
.buildV1()
);
// Add 2 deltas
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('cart1', 'items', 'item1')
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('cart1', 'items', 'item2')
.buildV1()
);
// Transaction should be complete
expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
expect(lossless.transactions.isComplete(transactionId)).toBe(true);
// View should show the cart
const view = hyperview.compose(['cart1']);
const view = lossless.compose(['cart1']);
expect(view.cart1).toBeDefined();
});
@@ -353,30 +353,30 @@ describe('Transactions', () => {
const transactionId = 'tx-no-size';
// Add delta with transaction reference but no size declaration
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.inTransaction(transactionId)
.setProperty('entity1', 'data', 'test')
.buildV1()
);
// Transaction should not be complete (no size)
expect(hyperview.transactions.isComplete(transactionId)).toBe(false);
expect(lossless.transactions.isComplete(transactionId)).toBe(false);
// Delta should not appear in view
const view = hyperview.compose(['entity1']);
const view = lossless.compose(['entity1']);
expect(view.entity1).toBeUndefined();
// Declare size after the fact
hyperview.ingestDelta(createDelta('system', 'host1')
lossless.ingestDelta(createDelta('system', 'host1')
.declareTransaction(transactionId, 1)
.buildV1()
);
// Now transaction should be complete
expect(hyperview.transactions.isComplete(transactionId)).toBe(true);
expect(lossless.transactions.isComplete(transactionId)).toBe(true);
// And delta should appear in view
const viewAfter = hyperview.compose(['entity1']);
const viewAfter = lossless.compose(['entity1']);
expect(viewAfter.entity1).toBeDefined();
});
});


@@ -1,5 +1,5 @@
import { QueryEngine } from '@src/query';
import { Hyperview } from '@src/views';
import { Lossless } from '@src/views';
import { DefaultSchemaRegistry } from '@src/schema';
import { SchemaBuilder, PrimitiveSchemas } from '@src/schema';
import { CommonSchemas } from '../../../util/schemas';
@@ -8,7 +8,7 @@ import { RhizomeNode } from '@src/node';
describe('Query Engine', () => {
let queryEngine: QueryEngine;
let hyperview: Hyperview;
let lossless: Lossless;
let schemaRegistry: DefaultSchemaRegistry;
let rhizomeNode: RhizomeNode;
@@ -19,9 +19,9 @@ describe('Query Engine', () => {
requestBindPort: 4003
});
hyperview = rhizomeNode.hyperview;
lossless = rhizomeNode.lossless;
schemaRegistry = new DefaultSchemaRegistry();
queryEngine = new QueryEngine(hyperview, schemaRegistry);
queryEngine = new QueryEngine(lossless, schemaRegistry);
// Register test schemas
schemaRegistry.register(CommonSchemas.User());
@@ -53,7 +53,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'name', name, 'user')
.buildV1();
hyperview.ingestDelta(nameDelta);
lossless.ingestDelta(nameDelta);
// Add age if provided
if (age !== undefined) {
@@ -62,7 +62,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'age', age, 'user')
.buildV1();
hyperview.ingestDelta(ageDelta);
lossless.ingestDelta(ageDelta);
}
// Add email if provided
@@ -72,7 +72,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'email', email, 'user')
.buildV1();
hyperview.ingestDelta(emailDelta);
lossless.ingestDelta(emailDelta);
}
}
@@ -83,7 +83,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'title', title, 'post')
.buildV1();
hyperview.ingestDelta(titleDelta);
lossless.ingestDelta(titleDelta);
// Author delta
const authorDelta = createDelta('test', 'test-host')
@@ -91,7 +91,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'author', author, 'post')
.buildV1();
hyperview.ingestDelta(authorDelta);
lossless.ingestDelta(authorDelta);
// Published delta
const publishedDelta = createDelta('test', 'test-host')
@@ -99,7 +99,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'published', published, 'post')
.buildV1();
hyperview.ingestDelta(publishedDelta);
lossless.ingestDelta(publishedDelta);
// Views delta
const viewsDelta = createDelta('test', 'test-host')
@@ -107,7 +107,7 @@ describe('Query Engine', () => {
.withTimestamp(Date.now())
.setProperty(id, 'views', views, 'post')
.buildV1();
hyperview.ingestDelta(viewsDelta);
lossless.ingestDelta(viewsDelta);
}
describe('Basic Query Operations', () => {


@@ -1,12 +1,12 @@
import {DeltaFilter} from '@src/core';
import {Hyperview} from '@src/views';
import {Lossless} from '@src/views';
import {RhizomeNode} from '@src/node';
import {createDelta} from '@src/core/delta-builder';
describe('Hyperview', () => {
describe('Lossless', () => {
const node = new RhizomeNode();
test('creates a hyperview of keanu as neo in the matrix', () => {
test('creates a lossless view of keanu as neo in the matrix', () => {
const delta = createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor')
@@ -35,11 +35,11 @@ describe('Hyperview', () => {
target: "usd"
}]);
const hyperview = new Hyperview(node);
const lossless = new Lossless(node);
hyperview.ingestDelta(delta);
lossless.ingestDelta(delta);
expect(hyperview.compose()).toMatchObject({
expect(lossless.compose()).toMatchObject({
keanu: {
referencedAs: ["actor"],
propertyDeltas: {
@@ -100,11 +100,11 @@ describe('Hyperview', () => {
.addPointer('salary_currency', 'usd')
.buildV2();
const hyperview = new Hyperview(node);
const lossless = new Lossless(node);
hyperview.ingestDelta(delta);
lossless.ingestDelta(delta);
expect(hyperview.compose()).toMatchObject({
expect(lossless.compose()).toMatchObject({
keanu: {
referencedAs: ["actor"],
propertyDeltas: {
@@ -157,18 +157,18 @@ describe('Hyperview', () => {
});
describe('can filter deltas', () => {
const hyperview = new Hyperview(node);
const lossless = new Lossless(node);
beforeAll(() => {
// First delta
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('A', 'H')
.setProperty('ace', 'value', '1', 'ace')
.buildV1()
);
// Second delta
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('B', 'H')
// 10 11j 12q 13k 14a
// .addPointer('14', 'ace', 'value')
@@ -176,7 +176,7 @@ describe('Hyperview', () => {
.buildV1()
);
expect(hyperview.compose()).toMatchObject({
expect(lossless.compose()).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@@ -205,7 +205,7 @@ describe('Hyperview', () => {
return creator === 'A' && host === 'H';
};
expect(hyperview.compose(undefined, filter)).toMatchObject({
expect(lossless.compose(undefined, filter)).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@@ -221,7 +221,7 @@ describe('Hyperview', () => {
}
});
expect(hyperview.compose(["ace"], filter)).toMatchObject({
expect(lossless.compose(["ace"], filter)).toMatchObject({
ace: {
referencedAs: ["ace"],
propertyDeltas: {
@@ -239,18 +239,18 @@ describe('Hyperview', () => {
});
test('filter with transactions', () => {
const hyperviewT = new Hyperview(node);
const losslessT = new Lossless(node);
const transactionId = 'tx-filter-test';
// Declare transaction with 3 deltas
hyperviewT.ingestDelta(
losslessT.ingestDelta(
createDelta('system', 'H')
.declareTransaction(transactionId, 3)
.buildV1()
);
// A1: First delta from creator A
hyperviewT.ingestDelta(
losslessT.ingestDelta(
createDelta('A', 'H')
.inTransaction(transactionId)
.setProperty('process1', 'status', 'started', 'step')
@@ -258,7 +258,7 @@ describe('Hyperview', () => {
);
// B: Delta from creator B
hyperviewT.ingestDelta(
losslessT.ingestDelta(
createDelta('B', 'H')
.inTransaction(transactionId)
.setProperty('process1', 'status', 'processing', 'step')
@@ -266,11 +266,11 @@ describe('Hyperview', () => {
);
// Transaction incomplete - nothing should show
const incompleteView = hyperviewT.compose(['process1']);
const incompleteView = losslessT.compose(['process1']);
expect(incompleteView.process1).toBeUndefined();
// A2: Second delta from creator A completes transaction
hyperviewT.ingestDelta(
losslessT.ingestDelta(
createDelta('A', 'H')
.inTransaction(transactionId)
.addPointer('step', 'process1', 'status')
@@ -279,13 +279,13 @@ describe('Hyperview', () => {
);
// All deltas visible now
const completeView = hyperviewT.compose(['process1']);
const completeView = losslessT.compose(['process1']);
expect(completeView.process1).toBeDefined();
expect(completeView.process1.propertyDeltas.status).toHaveLength(3);
// Filter by creator A only
const filterA: DeltaFilter = ({creator}) => creator === 'A';
const filteredView = hyperviewT.compose(['process1'], filterA);
const filteredView = losslessT.compose(['process1'], filterA);
expect(filteredView.process1).toBeDefined();
expect(filteredView.process1.propertyDeltas.status).toHaveLength(2);


@@ -1,12 +1,12 @@
import Debug from 'debug';
import { PointerTarget } from "@src/core/delta";
import { Hyperview, HyperviewOne } from "@src/views/hyperview";
import { Lossy } from "@src/views/view";
import { Lossless, LosslessViewOne } from "@src/views/lossless";
import { Lossy } from "@src/views/lossy";
import { RhizomeNode } from "@src/node";
import { valueFromDelta } from "@src/views/hyperview";
import { valueFromDelta } from "@src/views/lossless";
import { latestFromCollapsedDeltas } from "@src/views/resolvers/timestamp-resolvers";
import { createDelta } from "@src/core/delta-builder";
const debug = Debug('rz:test:view');
const debug = Debug('rz:test:lossy');
type Role = {
actor: PointerTarget,
@@ -21,9 +21,9 @@ type Summary = {
class Summarizer extends Lossy<Summary> {
private readonly debug: debug.Debugger;
constructor(hyperview: Hyperview) {
super(hyperview);
this.debug = Debug('rz:test:view:summarizer');
constructor(lossless: Lossless) {
super(lossless);
this.debug = Debug('rz:test:lossy:summarizer');
}
initializer(): Summary {
@@ -37,9 +37,9 @@ class Summarizer extends Lossy<Summary> {
// it's really not CRDT, it likely depends on the order of the pointers.
// TODO: Prove with failing test
reducer(acc: Summary, cur: HyperviewOne): Summary {
reducer(acc: Summary, cur: LosslessViewOne): Summary {
this.debug(`Processing view for entity ${cur.id} (referenced as: ${cur.referencedAs?.join(', ')})`);
this.debug(`hyperview:`, JSON.stringify(cur));
this.debug(`lossless view:`, JSON.stringify(cur));
if (cur.referencedAs?.includes("role")) {
this.debug(`Found role entity: ${cur.id}`);
@@ -92,12 +92,12 @@ class Summarizer extends Lossy<Summary> {
describe('Lossy', () => {
describe('use a provided initializer, reducer, and resolver to resolve entity views', () => {
const node = new RhizomeNode();
const hyperview = new Hyperview(node);
const lossless = new Lossless(node);
const view = new Summarizer(hyperview);
const lossy = new Summarizer(lossless);
beforeAll(() => {
hyperview.ingestDelta(createDelta('a', 'h')
lossless.ingestDelta(createDelta('a', 'h')
.addPointer('actor', 'keanu', 'roles')
.addPointer('role', 'neo', 'actor')
.addPointer('film', 'the_matrix', 'cast')
@@ -108,7 +108,7 @@ describe('Lossy', () => {
});
test('example summary', () => {
const result = view.resolve();
const result = lossy.resolve();
debug('result', result);
expect(result).toEqual({
roles: [{


@@ -85,10 +85,10 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('salary', 15000000)
.addPointer('contract_date', '1999-03-31')
.buildV1();
node.hyperview.ingestDelta(castingDelta);
node.lossless.ingestDelta(castingDelta);
// Test from Keanu's perspective
const keanuViews = node.hyperview.compose(['keanu']);
const keanuViews = node.lossless.compose(['keanu']);
const keanuView = keanuViews['keanu'];
expect(keanuView.propertyDeltas.filmography).toBeDefined();
@@ -97,7 +97,7 @@ describe('Multi-Pointer Delta Resolution', () => {
const nestedKeanuView = schemaRegistry.applySchemaWithNesting(
keanuView,
'actor',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -117,13 +117,13 @@ describe('Multi-Pointer Delta Resolution', () => {
}
// Test from Matrix's perspective
const matrixViews = node.hyperview.compose(['matrix']);
const matrixViews = node.lossless.compose(['matrix']);
const matrixView = matrixViews['matrix'];
const nestedMatrixView = schemaRegistry.applySchemaWithNesting(
matrixView,
'movie',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -169,16 +169,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('since', '2020-01-15')
.addPointer('intensity', 8)
.buildV1();
node.hyperview.ingestDelta(relationshipDelta);
node.lossless.ingestDelta(relationshipDelta);
// Test from Alice's perspective
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedAliceView = schemaRegistry.applySchemaWithNesting(
aliceView,
'person',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -244,16 +244,16 @@ describe('Multi-Pointer Delta Resolution', () => {
.addPointer('budget', 50000)
.addPointer('deadline', '2024-06-01')
.buildV1();
node.hyperview.ingestDelta(collaborationDelta);
node.lossless.ingestDelta(collaborationDelta);
// Test from project's perspective
const projectViews = node.hyperview.compose(['website']);
const projectViews = node.lossless.compose(['website']);
const projectView = projectViews['website'];
const nestedProjectView = schemaRegistry.applySchemaWithNesting(
projectView,
'project',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);


@@ -59,10 +59,10 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.hyperview.ingestDelta(friendshipDelta);
node.lossless.ingestDelta(friendshipDelta);
// Get Alice's hyperview
const aliceViews = node.hyperview.compose(['alice']);
// Get Alice's lossless view
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
expect(aliceView).toBeDefined();
@@ -73,7 +73,7 @@ describe('Nested Object Resolution', () => {
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -107,15 +107,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'nonexistent')
.buildV1();
node.hyperview.ingestDelta(friendshipDelta);
node.lossless.ingestDelta(friendshipDelta);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -158,23 +158,23 @@ describe('Nested Object Resolution', () => {
.addPointer('deep-users', 'alice', 'mentor')
.addPointer('mentor', 'bob')
.buildV1();
node.hyperview.ingestDelta(mentorshipDelta1);
node.lossless.ingestDelta(mentorshipDelta1);
// Bob's mentor is Charlie
const mentorshipDelta2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('deep-users', 'bob', 'mentor')
.addPointer('mentor', 'charlie')
.buildV1();
node.hyperview.ingestDelta(mentorshipDelta2);
node.lossless.ingestDelta(mentorshipDelta2);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
// Test with maxDepth = 1 (should only resolve Alice and Bob)
const shallowView = schemaRegistry.applySchemaWithNesting(
aliceView,
'deep-user',
node.hyperview,
node.lossless,
{ maxDepth: 1 }
);
@@ -197,7 +197,7 @@ describe('Nested Object Resolution', () => {
const deepView = schemaRegistry.applySchemaWithNesting(
aliceView,
'deep-user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -234,22 +234,22 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.hyperview.ingestDelta(friendship1);
node.lossless.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'bob', 'friends')
.addPointer('friends', 'alice')
.buildV1();
node.hyperview.ingestDelta(friendship2);
node.lossless.ingestDelta(friendship2);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
// Should handle circular reference without infinite recursion
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 3 }
);
@@ -275,15 +275,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'alice')
.buildV1();
node.hyperview.ingestDelta(selfFriendship);
node.lossless.ingestDelta(selfFriendship);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -311,21 +311,21 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.hyperview.ingestDelta(friendship1);
node.lossless.ingestDelta(friendship1);
const friendship2 = createDelta(node.config.creator, node.config.peerId)
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'charlie')
.buildV1();
node.hyperview.ingestDelta(friendship2);
node.lossless.ingestDelta(friendship2);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 2 }
);
@@ -373,15 +373,15 @@ describe('Nested Object Resolution', () => {
.addPointer('users', 'alice', 'friends')
.addPointer('friends', 'bob')
.buildV1();
node.hyperview.ingestDelta(friendship);
node.lossless.ingestDelta(friendship);
const aliceViews = node.hyperview.compose(['alice']);
const aliceViews = node.lossless.compose(['alice']);
const aliceView = aliceViews['alice'];
const nestedView = schemaRegistry.applySchemaWithNesting(
aliceView,
'user',
node.hyperview,
node.lossless,
{ maxDepth: 3 }
);


@@ -1,6 +1,6 @@
import {
RhizomeNode,
Hyperview,
Lossless,
AggregationResolver,
MinResolver,
MaxResolver,
@@ -13,31 +13,31 @@ import { createDelta } from "@src/core/delta-builder";
describe('Aggregation Resolvers', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('Basic Aggregation', () => {
test('should aggregate numbers using min resolver', () => {
const minResolver = new MinResolver(hyperview, ['score']);
const minResolver = new MinResolver(lossless, ['score']);
// Add first entity with score 10
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
// Add second entity with score 5
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection')
.buildV1()
);
// Add third entity with score 15
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection')
.buildV1()
);
@@ -52,20 +52,20 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using max resolver', () => {
const maxResolver = new MaxResolver(hyperview, ['score']);
const maxResolver = new MaxResolver(lossless, ['score']);
// Add deltas for entities
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 5, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity3', 'score', 15, 'collection')
.buildV1()
);
@@ -79,22 +79,22 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using sum resolver', () => {
const sumResolver = new SumResolver(hyperview, ['value']);
const sumResolver = new SumResolver(lossless, ['value']);
// Add first value for entity1
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);
// Add second value for entity1 (should sum)
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 20, 'collection')
.buildV1()
);
// Add value for entity2
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'value', 5, 'collection')
.buildV1()
);
@@ -107,21 +107,21 @@ describe('Aggregation Resolvers', () => {
});
test('should aggregate numbers using average resolver', () => {
const avgResolver = new AverageResolver(hyperview, ['score']);
const avgResolver = new AverageResolver(lossless, ['score']);
// Add multiple scores for entity1
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection')
.buildV1()
);
// Single value for entity2
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'score', 30, 'collection')
.buildV1()
);
@@ -134,21 +134,21 @@ describe('Aggregation Resolvers', () => {
});
test('should count values using count resolver', () => {
const countResolver = new CountResolver(hyperview, ['visits']);
const countResolver = new CountResolver(lossless, ['visits']);
// Add multiple visit deltas for entity1
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'visits', 1, 'collection')
.buildV1()
);
// Single visit for entity2
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity2', 'visits', 1, 'collection')
.buildV1()
);
@@ -163,40 +163,40 @@ describe('Aggregation Resolvers', () => {
describe('Custom Aggregation Configuration', () => {
test('should handle mixed aggregation types', () => {
const resolver = new AggregationResolver(hyperview, {
const resolver = new AggregationResolver(lossless, {
min_val: 'min' as AggregationType,
max_val: 'max' as AggregationType,
sum_val: 'sum' as AggregationType
});
// Add first set of values
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 10, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 5, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 3, 'collection')
.buildV1()
);
// Add second set of values
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'min_val', 5, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'max_val', 15, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'sum_val', 7, 'collection')
.buildV1()
);
@@ -212,25 +212,25 @@ describe('Aggregation Resolvers', () => {
});
test('should ignore non-numeric values', () => {
const resolver = new AggregationResolver(hyperview, {
const resolver = new AggregationResolver(lossless, {
score: 'sum' as AggregationType,
name: 'count' as AggregationType
});
// Add numeric value
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 10, 'collection')
.buildV1()
);
// Add non-numeric value (string)
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection')
.buildV1()
);
// Add another numeric value
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'score', 20, 'collection')
.buildV1()
);
@@ -244,9 +244,9 @@ describe('Aggregation Resolvers', () => {
});
test('should handle empty value arrays', () => {
const sumResolver = new SumResolver(hyperview, ['score']);
const sumResolver = new SumResolver(lossless, ['score']);
// Create entity with non-aggregated property
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'name', 'test', 'collection')
.buildV1()
);
@@ -261,9 +261,9 @@ describe('Aggregation Resolvers', () => {
describe('Edge Cases', () => {
test('should handle single value aggregations', () => {
const avgResolver = new AverageResolver(hyperview, ['value']);
const avgResolver = new AverageResolver(lossless, ['value']);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 42, 'collection')
.buildV1()
);
@@ -275,14 +275,14 @@ describe('Aggregation Resolvers', () => {
});
test('should handle zero values', () => {
const sumResolver = new SumResolver(hyperview, ['value']);
const sumResolver = new SumResolver(lossless, ['value']);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 0, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);
@@ -294,14 +294,14 @@ describe('Aggregation Resolvers', () => {
});
test('should handle negative values', () => {
const minResolver = new MinResolver(hyperview, ['value']);
const minResolver = new MinResolver(lossless, ['value']);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', -5, 'collection')
.buildV1()
);
hyperview.ingestDelta(createDelta('test', 'host1')
lossless.ingestDelta(createDelta('test', 'host1')
.setProperty('entity1', 'value', 10, 'collection')
.buildV1()
);


@@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Hyperview, createDelta } from '@src';
import { CollapsedDelta } from '@src/views/hyperview';
import { RhizomeNode, Lossless, createDelta } from '@src';
import { CollapsedDelta } from '@src/views/lossless';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null;
describe('Basic Dependency Resolution', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
test('should resolve dependencies in correct order', () => {
@@ -51,20 +51,20 @@ describe('Basic Dependency Resolution', () => {
}
}
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
first: new FirstPlugin(),
second: new SecondPlugin()
});
// Add some data
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test1', 'first', 'hello', 'test')
.buildV1()
);
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('test1', 'second', 'world', 'test')


@@ -1,17 +1,17 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Hyperview } from '@src';
import { CollapsedDelta } from '@src/views/hyperview';
import { RhizomeNode, Lossless } from '@src';
import { CollapsedDelta } from '@src/views/lossless';
import { CustomResolver, ResolverPlugin } from '@src/views/resolvers/custom-resolvers';
type PropertyTypes = string | number | boolean | null;
describe('Circular Dependency Detection', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
test('should detect circular dependencies', () => {
@@ -53,7 +53,7 @@ describe('Circular Dependency Detection', () => {
// Should throw an error when circular dependencies are detected
expect(() => {
new CustomResolver(hyperview, {
new CustomResolver(lossless, {
'a': new PluginA(),
'b': new PluginB()
});
@@ -84,7 +84,7 @@ describe('Circular Dependency Detection', () => {
// Should detect the circular dependency: a -> c -> b -> a
expect(() => {
new CustomResolver(hyperview, {
new CustomResolver(lossless, {
'a': new PluginA(),
'b': new PluginB(),
'c': new PluginC()


@@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src';
import {
CustomResolver,
DependencyStates,
@@ -9,11 +9,11 @@ import { PropertyTypes } from '@src/core/types';
describe('Edge Cases', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
test('should handle null values', () => {
@@ -45,11 +45,11 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
value: new NullSafeLastWriteWinsPlugin()
});
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'value', null, 'test')
@@ -95,19 +95,19 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
value: new ConcurrentUpdatePlugin()
});
// Two updates with the same timestamp
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'value', null, 'test')
.buildV1()
);
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user2', 'host2')
.withTimestamp(1000) // Same timestamp
.setProperty('test2', 'value', 'xylophone', 'test')
@@ -149,13 +149,13 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
counter: new CounterPlugin()
});
// Add 1000 updates
for (let i = 0; i < 1000; i++) {
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000 + i)
.setProperty('test3', 'counter', i, 'test')
@@ -193,7 +193,7 @@ describe('Edge Cases', () => {
}
}
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
missing: new MissingPropertyPlugin()
});


@@ -1,6 +1,6 @@
import { PropertyID } from '@src/core/types';
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Hyperview, createDelta } from '@src';
import { RhizomeNode, Lossless, createDelta } from '@src';
import {
CustomResolver,
LastWriteWinsPlugin,
@@ -10,7 +10,7 @@ import {
ResolverPlugin
} from '@src/views/resolvers/custom-resolvers';
import Debug from 'debug';
const debug = Debug('rz:test:hyperview');
const debug = Debug('rz:test:lossless');
// A simple plugin that depends on other plugins
class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initialized: boolean }> {
@@ -49,15 +49,15 @@ class AveragePlugin<Targets extends PropertyID> extends ResolverPlugin<{ initial
describe('Multiple Plugins Integration', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
test('should handle multiple plugins with dependencies', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
temperature: new LastWriteWinsPlugin(),
maxTemp: new MaxPlugin('temperature'),
minTemp: new MinPlugin('temperature'),
@@ -67,7 +67,7 @@ describe('Multiple Plugins Integration', () => {
// Add some temperature readings
const readings = [22, 25, 18, 30, 20];
readings.forEach((temp, index) => {
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('sensor1', 'host1')
.withTimestamp(1000 + index * 1000)
.setProperty('room1', 'temperature', temp, 'sensors')
@@ -89,7 +89,7 @@ describe('Multiple Plugins Integration', () => {
});
test('should handle multiple entities with different plugins', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
name: new LastWriteWinsPlugin(),
tags: new ConcatenationPlugin(),
score: new MaxPlugin()
@@ -97,7 +97,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting first delta`);
// Add data for entity1
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'name', 'Test Entity', 'test-name')
@@ -107,7 +107,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting second delta`);
// Add more tags to entity1
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('entity1', 'tags', 'tag2', 'test')
@@ -116,7 +116,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting third delta`);
// Add data for entity2
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity2', 'score', 85, 'test')
@@ -125,7 +125,7 @@ describe('Multiple Plugins Integration', () => {
debug(`Creating and ingesting fourth delta`);
// Update score for entity2
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('entity2', 'score', 90, 'test')


@@ -1,5 +1,5 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode, Hyperview, createDelta, CollapsedDelta } from '@src';
import { RhizomeNode, Lossless, createDelta, CollapsedDelta } from '@src';
import {
CustomResolver,
ResolverPlugin,
@@ -51,20 +51,20 @@ type LifecycleTestState = {
describe('Plugin Lifecycle', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
test('should call initialize, update, and resolve in order', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
test: new LifecycleTestPlugin()
});
// Add some data
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test1', 'test', 'value1')
@@ -92,12 +92,12 @@ describe('Plugin Lifecycle', () => {
});
test('should handle multiple updates correctly', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
test: new LifecycleTestPlugin()
});
// First update
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('test2', 'test', 'value1')
@@ -105,7 +105,7 @@ describe('Plugin Lifecycle', () => {
);
// Second update
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(2000)
.setProperty('test2', 'test', 'value2')
@@ -132,7 +132,7 @@ describe('Plugin Lifecycle', () => {
});
test('should handle empty state', () => {
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
test: new LifecycleTestPlugin()
});


@@ -1,7 +1,7 @@
import { describe, test, expect } from '@jest/globals';
import { ResolverPlugin, DependencyStates } from '@src/views/resolvers/custom-resolvers';
import { PropertyTypes } from '@src/core/types';
import type { CollapsedDelta } from '@src/views/hyperview';
import type { CollapsedDelta } from '@src/views/lossless';
import { testResolverWithPlugins, createTestDelta } from '@test-helpers/resolver-test-helper';
class CountPlugin extends ResolverPlugin<{ count: number }> {


@@ -1,6 +1,6 @@
import { describe, test, expect, beforeEach } from '@jest/globals';
import { RhizomeNode } from '@src';
import { Hyperview } from '@src/views/hyperview';
import { Lossless } from '@src/views/lossless';
import { CustomResolver } from '@src/views/resolvers/custom-resolvers';
import { ResolverPlugin } from '@src/views/resolvers/custom-resolvers/plugin';
// import Debug from 'debug';
@@ -25,11 +25,11 @@ class TestPlugin extends ResolverPlugin<unknown> {
describe('CustomResolver', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('buildDependencyGraph', () => {
@@ -42,7 +42,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(hyperview, plugins);
const resolver = new CustomResolver(lossless, plugins);
const graph = resolver.dependencyGraph;
@@ -65,7 +65,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(hyperview, plugins);
const resolver = new CustomResolver(lossless, plugins);
// Access private method for testing
const graph = resolver.dependencyGraph;
@@ -86,7 +86,7 @@ describe('CustomResolver', () => {
// Act & Assert
expect(() => {
new CustomResolver(hyperview, plugins);
new CustomResolver(lossless, plugins);
}).toThrow('Dependency nonexistent not found for plugin a');
});
@@ -99,7 +99,7 @@ describe('CustomResolver', () => {
};
// Act
const resolver = new CustomResolver(hyperview, plugins);
const resolver = new CustomResolver(lossless, plugins);
// Access private method for testing
const graph = resolver.dependencyGraph;
@@ -125,7 +125,7 @@ describe('CustomResolver', () => {
// Act & Assert
expect(() => {
new CustomResolver(hyperview, plugins);
new CustomResolver(lossless, plugins);
}).toThrow('Circular dependency detected in plugin dependencies');
});
});


@@ -1,6 +1,6 @@
import Debug from "debug";
import { createDelta } from '@src/core/delta-builder';
import { Hyperview, RhizomeNode } from '@src';
import { Lossless, RhizomeNode } from '@src';
import { TimestampResolver } from '@src/views/resolvers/timestamp-resolvers';
const debug = Debug('rz:test:last-write-wins');
@@ -11,24 +11,24 @@ describe('Last write wins', () => {
describe('given that two separate writes occur', () => {
const node = new RhizomeNode();
const hyperview = new Hyperview(node);
const lossless = new Lossless(node);
const view = new TimestampResolver(hyperview);
const lossy = new TimestampResolver(lossless);
beforeAll(() => {
hyperview.ingestDelta(createDelta('a', 'h')
lossless.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 95, 'vegetable')
.buildV1()
);
hyperview.ingestDelta(createDelta('a', 'h')
lossless.ingestDelta(createDelta('a', 'h')
.setProperty('broccoli', 'want', 90, 'vegetable')
.buildV1()
);
});
test('our resolver should return the most recently written value', () => {
const result = view.resolve(["broccoli"]);
const result = lossy.resolve(["broccoli"]);
debug('result', result);
expect(result).toMatchObject({
broccoli: {


@@ -1,5 +1,5 @@
import { RhizomeNode, Hyperview, createDelta } from "@src";
import { CollapsedDelta } from "@src/views/hyperview";
import { RhizomeNode, Lossless, createDelta } from "@src";
import { CollapsedDelta } from "@src/views/lossless";
import {
CustomResolver,
ResolverPlugin,
@@ -10,11 +10,11 @@ import { PropertyTypes } from '@src/core/types';
describe('State Visibility', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
// A test plugin that records which states it sees
@@ -88,10 +88,10 @@ describe('State Visibility', () => {
prop2: spy2
} as const;
const resolver = new CustomResolver(hyperview, config);
const resolver = new CustomResolver(lossless, config);
// Add some data
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'prop1', 'value1', 'entity-prop1')
@@ -127,10 +127,10 @@ describe('State Visibility', () => {
dependsOn: dependency
} as const;
const resolver = new CustomResolver(hyperview, config);
const resolver = new CustomResolver(lossless, config);
// Add some data
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@@ -157,14 +157,14 @@ describe('State Visibility', () => {
const lastWrite = new LastWriteWinsPlugin();
const other = new LastWriteWinsPlugin();
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(lossless, {
dependent: dependent,
dependsOn: lastWrite,
other: other // Not declared as a dependency
});
// Add some data
hyperview.ingestDelta(
lossless.ingestDelta(
createDelta('user1', 'host1')
.withTimestamp(1000)
.setProperty('entity1', 'dependsOn', 'baseValue', 'prop1')
@@ -214,7 +214,7 @@ describe('State Visibility', () => {
}
expect(() => {
new CustomResolver(hyperview, {
new CustomResolver(lossless, {
bad: new PluginWithBadDeps()
});
}).toThrow("Dependency nonexistent not found for plugin bad");


@@ -1,6 +1,6 @@
import {
RhizomeNode,
Hyperview,
Lossless,
TimestampResolver,
CreatorIdTimestampResolver,
DeltaIdTimestampResolver,
@@ -13,19 +13,19 @@ const debug = Debug('rz:test:timestamp-resolvers');
describe('Timestamp Resolvers', () => {
let node: RhizomeNode;
let hyperview: Hyperview;
let lossless: Lossless;
beforeEach(() => {
node = new RhizomeNode();
hyperview = new Hyperview(node);
lossless = new Lossless(node);
});
describe('Basic Timestamp Resolution', () => {
test('should resolve by most recent timestamp', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
// Add older delta
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@@ -34,7 +34,7 @@ describe('Timestamp Resolvers', () => {
);
// Add newer delta
hyperview.ingestDelta(createDelta('user2', 'host2')
lossless.ingestDelta(createDelta('user2', 'host2')
.withId('delta2')
.withTimestamp(2000)
.addPointer('collection', 'entity1', 'score')
@@ -50,10 +50,10 @@ describe('Timestamp Resolvers', () => {
});
test('should handle multiple entities with different timestamps', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
// Entity1 - older value
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
.addPointer('value', 100)
@@ -61,7 +61,7 @@ describe('Timestamp Resolvers', () => {
);
// Entity2 - newer value
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withTimestamp(2000)
.addPointer('collection', 'entity2', 'value')
.addPointer('value', 200)
@@ -78,10 +78,10 @@ describe('Timestamp Resolvers', () => {
describe('Tie-Breaking Strategies', () => {
test('should break ties using creator-id strategy', () => {
const resolver = new CreatorIdTimestampResolver(hyperview);
const resolver = new CreatorIdTimestampResolver(lossless);
// Two deltas with same timestamp, different creators
hyperview.ingestDelta(createDelta('user_z', 'host1')
lossless.ingestDelta(createDelta('user_z', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@@ -89,7 +89,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user_a', 'host1')
lossless.ingestDelta(createDelta('user_a', 'host1')
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@@ -105,10 +105,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using delta-id strategy', () => {
const resolver = new DeltaIdTimestampResolver(hyperview);
const resolver = new DeltaIdTimestampResolver(lossless);
// Two deltas with same timestamp, different delta IDs
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@@ -116,7 +116,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@@ -132,10 +132,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using host-id strategy', () => {
const resolver = new HostIdTimestampResolver(hyperview);
const resolver = new HostIdTimestampResolver(lossless);
// Two deltas with same timestamp, different hosts
hyperview.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later
lossless.ingestDelta(createDelta('user1', 'host_z') // Lexicographically later
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@@ -143,7 +143,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier
lossless.ingestDelta(createDelta('user1', 'host_a') // Lexicographically earlier
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@@ -159,10 +159,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using lexicographic strategy with string values', () => {
const resolver = new LexicographicTimestampResolver(hyperview);
const resolver = new LexicographicTimestampResolver(lossless);
// Two deltas with same timestamp, different string values
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@@ -170,7 +170,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta2')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name')
@@ -186,10 +186,10 @@ describe('Timestamp Resolvers', () => {
});
test('should break ties using lexicographic strategy with numeric values (falls back to delta ID)', () => {
const resolver = new LexicographicTimestampResolver(hyperview);
const resolver = new LexicographicTimestampResolver(lossless);
// Two deltas with same timestamp, numeric values (should fall back to delta ID comparison)
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta_a') // Lexicographically earlier
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'score')
@@ -197,7 +197,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta_z') // Lexicographically later
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'score')
@@ -215,11 +215,11 @@ describe('Timestamp Resolvers', () => {
describe('Complex Tie-Breaking Scenarios', () => {
test('should handle multiple properties with different tie-breaking outcomes', () => {
const creatorResolver = new CreatorIdTimestampResolver(hyperview);
const deltaResolver = new DeltaIdTimestampResolver(hyperview);
const creatorResolver = new CreatorIdTimestampResolver(lossless);
const deltaResolver = new DeltaIdTimestampResolver(lossless);
// Add deltas for multiple properties with same timestamp
hyperview.ingestDelta(createDelta('user_a', 'host1')
lossless.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_z')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@@ -227,7 +227,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user_z', 'host1')
lossless.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_a')
.withTimestamp(1000) // Same timestamp
.addPointer('collection', 'entity1', 'name')
@ -249,10 +249,10 @@ describe('Timestamp Resolvers', () => {
});
test('should work consistently with timestamp priority over tie-breaking', () => {
const resolver = new CreatorIdTimestampResolver(hyperview);
const resolver = new CreatorIdTimestampResolver(lossless);
// Add older delta with "better" tie-breaking attributes
hyperview.ingestDelta(createDelta('user_z', 'host1')
lossless.ingestDelta(createDelta('user_z', 'host1')
.withId('delta_z') // Would win in delta ID tie-breaking
.withTimestamp(1000) // Older timestamp
.addPointer('collection', 'entity1', 'score')
@ -261,7 +261,7 @@ describe('Timestamp Resolvers', () => {
);
// Add newer delta with "worse" tie-breaking attributes
hyperview.ingestDelta(createDelta('user_a', 'host1')
lossless.ingestDelta(createDelta('user_a', 'host1')
.withId('delta_a') // Would lose in delta ID tie-breaking
.withTimestamp(2000) // Newer timestamp
.addPointer('collection', 'entity1', 'score')
@ -279,8 +279,8 @@ describe('Timestamp Resolvers', () => {
describe('Edge Cases', () => {
test('should handle single delta correctly', () => {
const resolver = new TimestampResolver(hyperview, 'creator-id');
hyperview.ingestDelta(createDelta('user1', 'host1')
const resolver = new TimestampResolver(lossless, 'creator-id');
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'value')
@ -295,9 +295,9 @@ describe('Timestamp Resolvers', () => {
});
test('should handle mixed value types correctly', () => {
const resolver = new TimestampResolver(hyperview);
const resolver = new TimestampResolver(lossless);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta1')
.withTimestamp(1000)
.addPointer('collection', 'entity1', 'name')
@ -305,7 +305,7 @@ describe('Timestamp Resolvers', () => {
.buildV1()
);
hyperview.ingestDelta(createDelta('user1', 'host1')
lossless.ingestDelta(createDelta('user1', 'host1')
.withId('delta2')
.withTimestamp(1001)
.addPointer('collection', 'entity1', 'score')

View File

@ -11,7 +11,7 @@ classDiagram
-requestReply: RequestReply
-httpServer: HttpServer
-deltaStream: DeltaStream
-hyperview: Hyperview
-lossless: Lossless
-peers: Peers
-queryEngine: QueryEngine
-storageQueryEngine: StorageQueryEngine
@ -27,15 +27,15 @@ classDiagram
+pointers: PointerV1[]
}
class Hyperview {
-domainEntities: Map<DomainEntityID, HyperviewEntity>
class Lossless {
-domainEntities: Map<DomainEntityID, LosslessEntity>
-transactions: Transactions
+view(ids: DomainEntityID[]): HyperviewMany
+compose(ids: DomainEntityID[]): HyperviewMany
+view(ids: DomainEntityID[]): LosslessViewMany
+compose(ids: DomainEntityID[]): LosslessViewMany
}
class QueryEngine {
-hyperview: Hyperview
-lossless: Lossless
-schemaRegistry: SchemaRegistry
+query(schemaId: SchemaID, filter?: JsonLogic): Promise<SchemaAppliedViewWithNesting[]>
}
@ -69,23 +69,23 @@ classDiagram
%% Relationships
RhizomeNode --> DeltaStream
RhizomeNode --> Hyperview
RhizomeNode --> Lossless
RhizomeNode --> QueryEngine
RhizomeNode --> StorageQueryEngine
RhizomeNode --> SchemaRegistry
RhizomeNode --> DeltaStorage
Hyperview --> Transactions
Hyperview --> HyperviewEntity
Lossless --> Transactions
Lossless --> LosslessEntity
QueryEngine --> SchemaRegistry
QueryEngine --> Hyperview
QueryEngine --> Lossless
StorageQueryEngine --> DeltaStorage
StorageQueryEngine --> SchemaRegistry
DeltaStream --> Delta
Hyperview --> Delta
Lossless --> Delta
DockerOrchestrator --> ContainerManager
DockerOrchestrator --> NetworkManager
@ -103,7 +103,7 @@ classDiagram
- Represents atomic changes in the system
- Contains pointers to entities and their properties
3. **Hyperview**: Manages the hyperview of data
3. **Lossless**: Manages the lossless view of data
- Maintains the complete history of deltas
- Provides methods to view and compose entity states
@ -126,7 +126,7 @@ classDiagram
## Data Flow
1. Deltas are received through the DeltaStream
2. Hyperview processes and stores these deltas
2. Lossless processes and stores these deltas
3. Queries can be made through either QueryEngine (in-memory) or StorageQueryEngine (persisted)
4. The system maintains consistency through the schema system
5. In distributed mode, DockerOrchestrator manages multiple node instances

View File

@ -10,11 +10,11 @@ The `CustomResolver` class is the main entry point for the Custom Resolver syste
class CustomResolver {
/**
* Creates a new CustomResolver instance
* @param view The hyperview to resolve
* @param view The lossless view to resolve
* @param config Plugin configuration
*/
constructor(
private readonly view: Hyperview,
private readonly view: LosslessView,
private readonly config: ResolverConfig
);
@ -48,7 +48,7 @@ class CustomResolver {
Creates a new instance of the CustomResolver.
**Parameters:**
- `view: Hyperview` - The hyperview containing the data to resolve
- `view: LosslessView` - The lossless view containing the data to resolve
- `config: ResolverConfig` - Configuration object mapping property IDs to their resolver plugins
**Example:**
@ -146,10 +146,10 @@ The resolver may throw the following errors:
```typescript
import { CustomResolver, LastWriteWinsPlugin } from './resolver';
import { Hyperview } from '../hyperview-view';
import { LosslessView } from '../lossless-view';
// Create a hyperview with some data
const view = new Hyperview();
// Create a lossless view with some data
const view = new LosslessView();
// ... add data to the view ...
// Configure the resolver

View File

@ -22,26 +22,22 @@ The `CustomResolver` system provides a flexible framework for resolving property
## Getting Started
```typescript
import { CustomResolver } from '../src/views/resolvers/custom-resolvers';
import { LastWriteWinsPlugin } from '../src/views/resolvers/custom-resolvers/plugins';
import { Hyperview } from '../src/views/hyperview';
import { CustomResolver, LastWriteWinsPlugin } from './resolver';
import { LosslessView } from '../lossless-view';
// Create a hyperview
const hyperview = new Hyperview();
// Create a lossless view
const view = new LosslessView();
// Create a resolver with a last-write-wins strategy
const resolver = new CustomResolver(hyperview, {
const resolver = new CustomResolver(view, {
myProperty: new LastWriteWinsPlugin()
});
// Process updates through the hyperview
// hyperview.applyDelta(delta);
// Process updates
// ...
// Get resolved values for specific entities
const result = resolver.resolve(['entity1', 'entity2']);
// Or get all resolved values
const allResults = resolver.resolveAll();
// Get resolved values
const result = resolver.resolve();
```
## Next Steps

View File

@ -78,11 +78,11 @@ Create tests to verify your plugin's behavior:
```typescript
describe('DiscountedPricePlugin', () => {
let view: Hyperview;
let view: LosslessView;
let resolver: CustomResolver;
beforeEach(() => {
view = new Hyperview();
view = new LosslessView();
resolver = new CustomResolver(view, {
basePrice: new LastWriteWinsPlugin(),
discount: new LastWriteWinsPlugin(),

View File

@ -1,150 +1,15 @@
# Views and Resolvers
# Resolvers (Views)
## Core Concepts
The workhorse of this system is likely going to be our lossy views.
This is where most of the computation occurs.
### Views
So, let's talk about how to create a view.
A `View` (previously known as `Lossy`) is a derived, computed representation of your data that provides efficient access to resolved values. It's built on top of the `Hyperview` (previously `Lossless`) storage layer, which maintains the complete history of all deltas.
A lossy view initializes from a given lossless view.
The lossless view dispatches events when entity properties are updated.
```typescript
// Basic View implementation
export abstract class View<Accumulator, Result = Accumulator> {
// Core methods
abstract reducer(acc: Accumulator, cur: HyperviewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
// Built-in functionality
resolve(entityIds?: DomainEntityID[]): Result | undefined;
}
```
View semantics are similar to map-reduce, resolvers in Redux, etc.
### Hyperview
`Hyperview` (previously `Lossless`) maintains the complete, immutable history of all deltas and provides the foundation for building derived views.
```typescript
// Basic Hyperview usage
const hyperview = new Hyperview(rhizomeNode);
const view = new MyCustomView(hyperview);
const result = view.resolve(['entity1', 'entity2']);
```
## Creating Custom Resolvers
### Basic Resolver Pattern
1. **Define your accumulator type**: This holds the state of your view
2. **Implement the reducer**: Pure function that updates the accumulator
3. **Optionally implement resolver**: Transforms the accumulator into the final result
```typescript
class CountResolver extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + Object.keys(view.propertyDeltas).length;
}
initializer() { return 0; }
}
```
### Custom Resolver with Dependencies
Resolvers can depend on other resolvers using the `CustomResolver` system:
```typescript
// Define resolver plugins
const resolver = new CustomResolver(hyperview, {
totalItems: new CountPlugin(),
average: new AveragePlugin()
});
// Get resolved values
const result = resolver.resolve(['entity1', 'entity2']);
```
## Built-in Resolvers
### Last Write Wins
Keeps the most recent value based on timestamp:
```typescript
const resolver = new CustomResolver(hyperview, {
status: new LastWriteWinsPlugin()
});
```
### First Write Wins
Keeps the first value that was written:
```typescript
const resolver = new CustomResolver(hyperview, {
initialStatus: new FirstWriteWinsPlugin()
});
```
### Aggregation Resolvers
Built-in aggregators for common operations:
- `SumPlugin`: Sum of numeric values
- `AveragePlugin`: Average of numeric values
- `CountPlugin`: Count of values
- `MinPlugin`: Minimum value
- `MaxPlugin`: Maximum value
## Advanced Topics
### Performance Considerations
1. **Efficient Reducers**: Keep reducers pure and fast
2. **Selective Updates**: Only process changed entities
3. **Memoization**: Cache expensive computations
### Common Patterns
1. **Derived Properties**: Calculate values based on other properties
2. **State Machines**: Track state transitions over time
3. **Validation**: Enforce data consistency rules
## Best Practices
1. **Keep Reducers Pure**: Avoid side effects in reducers
2. **Use Strong Typing**: Leverage TypeScript for type safety
3. **Handle Edge Cases**: Consider empty states and error conditions
4. **Profile Performance**: Monitor and optimize hot paths
## Examples
### Simple Counter
```typescript
class CounterView extends View<number> {
reducer(acc: number, view: HyperviewOne): number {
return acc + (view.propertyDeltas['increment']?.length ?? 0);
}
initializer() { return 0; }
}
```
### Running Average
```typescript
class RunningAverageView extends View<{sum: number, count: number}, number> {
reducer(acc: {sum: number, count: number}, view: HyperviewOne): {sum: number, count: number} {
const value = 0; // TODO: extract the numeric value from the view
return {
sum: acc.sum + value,
count: acc.count + 1
};
}
resolver(acc: {sum: number, count: number}): number {
return acc.count > 0 ? acc.sum / acc.count : 0;
}
initializer() { return {sum: 0, count: 0}; }
}
```
The key is to identify your accumulator object.
Your algorithm SHOULD be implemented so that the reducer is a pure function.
All state must therefore be stored in the accumulator.

View File

@ -56,7 +56,7 @@ const unsafeDelta = createDelta('peer1', 'peer1')
.buildV1();
// This will be ingested without validation
node.hyperview.ingestDelta(unsafeDelta);
node.lossless.ingestDelta(unsafeDelta);
// 5. Check validation status after the fact
const stats = collection.getValidationStats();

View File

@ -84,9 +84,9 @@ const delta = createTestDelta('user1', 'host1')
## How It Works
1. Creates a new `Hyperview` instance for the test
1. Creates a new `Lossless` instance for the test
2. Sets up a `CustomResolver` with the provided plugins
3. Ingests all provided deltas into the `Hyperview` instance
3. Ingests all provided deltas into the `Lossless` instance
4. Retrieves a view for the specified entity
5. Processes the view through the resolver
6. Calls the `expectedResult` callback with the resolved entity
@ -97,4 +97,4 @@ const delta = createTestDelta('user1', 'host1')
- The helper handles all setup and teardown of test resources
- Use `createTestDelta` for consistent delta creation in tests
- The helper ensures type safety between the resolver and the expected result type
- Each test gets a fresh `Hyperview` instance automatically
- Each test gets a fresh `Lossless` instance automatically
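The six steps above can be sketched as a minimal, self-contained helper. This is an illustration of the flow only; the names (`testResolverWithPlugins`, `ResolverPlugin`) and the flat `Delta` shape are assumptions for the sketch, not the project's actual API:

```typescript
// Simplified stand-ins for the real types (assumption: flat delta shape).
type Delta = { entityId: string; property: string; value: unknown };
type ResolverPlugin = (deltas: Delta[]) => unknown;

function testResolverWithPlugins(
  deltas: Delta[],
  plugins: Record<string, ResolverPlugin>,
  entityId: string,
  expectedResult: (entity: Record<string, unknown>) => void,
): void {
  // Steps 1-3: a fresh per-test store, into which all deltas are ingested.
  const store = new Map<string, Delta[]>();
  for (const d of deltas.filter((d) => d.entityId === entityId)) {
    store.set(d.property, [...(store.get(d.property) ?? []), d]);
  }
  // Steps 4-5: retrieve the entity's deltas and run each property
  // through its configured resolver plugin.
  const entity: Record<string, unknown> = {};
  for (const [property, plugin] of Object.entries(plugins)) {
    entity[property] = plugin(store.get(property) ?? []);
  }
  // Step 6: hand the resolved entity to the test's expectations.
  expectedResult(entity);
}
```

Usage mirrors the helper described above: pass deltas, a plugin per property, the entity ID, and a callback containing the test's assertions.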

View File

@ -18,8 +18,8 @@ type User = {
(async () => {
const rhizomeNode = new RhizomeNode();
// Enable API to read hyperview
rhizomeNode.httpServer.httpApi.serveHyperview();
// Enable API to read lossless view
rhizomeNode.httpServer.httpApi.serveLossless();
const users = new BasicCollection("user");
users.rhizomeConnect(rhizomeNode);

View File

@ -136,10 +136,10 @@ smalltalk type messaging structure on top of the database
note dimensions of attack surface
layers:
primitives - what's a delta, schema, materialized view, view
primitives - what's a delta, schema, materialized view, lossy view
delta store - allows you to persist deltas and query over the delta stream
materialized view store - view snapshot(s)
view bindings - e.g. graphql, define what a user looks like, that gets application bindings
materialized view store - lossy snapshot(s)
lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
-- idea: diff tools

View File

@ -34,10 +34,10 @@ Pre-built technologies? LevelDB
Can use LevelDB to store deltas
Structure can correspond to our desired ontology
Layers for
- primitives - what's a delta, schema, materialized view, view
- primitives - what's a delta, schema, materialized view, lossy view
- delta store - allows you to persist deltas and query over the delta stream
- materialized view store - view snapshot(s)
- view bindings - e.g. graphql, define what a user looks like, that gets application bindings
- materialized view store - lossy snapshot(s)
- lossy bindings - e.g. graphql, define what a user looks like, that gets application bindings
e.g. where your resolvers come in, so that your fields aren't all arrays, i.e. you resolve conflicts
Protocol involving ZeroMQ--

View File

@ -1,11 +1,11 @@
> myk:
> I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (view representation) <-> Hyperview Representation <-> Delta[] I think
> I think so far this seems mostly on point, but I'd focus on building the bridge between Domain Entity (lossy representation) <-> Lossless Representation <-> Delta[] I think
> the tricky stuff comes in with, like, how do you take an undifferentiated stream of deltas, a query and a schema
> and filter / merge those deltas into the hyperview tree structure you need in order to then reduce into a view domain node
> and filter / merge those deltas into the lossless tree structure you need in order to then reduce into a lossy domain node
> if that part of the flow works then the rest becomes fairly self-evident
> a "hyperview representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc
> a "lossless representation" is basically a DAG/Tree that starts with a root node whose properties each contain the deltas that assign values to them, where the delta may have a pointer up to "this" and then down to some related domain node, which gets interpolated into the tree instead of just referenced, and it has its properties contain the deltas that target it, etc
> so you need both the ID of the root node (the thing being targeted by one or more deltas) as well as the schema to apply to determine which contexts on that target to include (target_context effectively becomes a property on the domain entity, right?), as well as which schema to apply to included referenced entities, etc.
> so it's what keeps you from returning the whole stream of deltas, while still allowing you to follow arbitrary edges
@ -31,7 +31,7 @@ Example delta:
target: "usd"
}]
Hyperview transformation:
Lossless transformation:
{
keanu: {

View File

@ -9,7 +9,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
1. **Deltas are relationships**: Every delta with pointers already expresses relationships
2. **Patterns, not structure**: We're recognizing patterns in how deltas connect entities
3. **Perspective-driven**: Different views/resolvers can interpret the same deltas differently
4. **No single truth**: Competing deltas are resolved by application-level view resolvers
4. **No single truth**: Competing deltas are resolved by application-level lossy resolvers
5. **Time-aware**: All queries are inherently temporal, showing different relationships at different times
## Current State ✅
@ -17,7 +17,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
- **All tests passing**: 21/21 suites, 183/183 tests (100%)
- **Delta system**: Fully functional with pointers expressing relationships
- **Negation system**: Can invalidate deltas (and thus relationships)
- **Query system**: Basic traversal of hyperviews
- **Query system**: Basic traversal of lossless views
- **Schema system**: Can describe entity structures
- **Resolver system**: Application-level interpretation of deltas
@ -120,7 +120,7 @@ Phase 4 recognizes that in Rhizome, **deltas ARE relationships**. Instead of add
class PatternResolver {
// Interpret deltas matching certain patterns
resolveWithPatterns(entityId, patterns) {
const deltas = this.hyperview.getDeltasForEntity(entityId);
const deltas = this.lossless.getDeltasForEntity(entityId);
return {
entity: entityId,

View File

@ -2,7 +2,7 @@
## Executive Summary
The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → hyperview → view transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented.
The rhizome-node implementation demonstrates strong alignment with core spec concepts but lacks implementation and testing for several advanced features. The fundamental delta → lossless → lossy transformation pipeline is well-implemented, while query systems, relational features, and advanced conflict resolution remain unimplemented.
## Core Concept Alignment
@ -13,13 +13,13 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
- **Implementation**: Correctly implements both V1 (array) and V2 (object) formats
- **Tests**: Basic format conversion tested, but validation gaps exist
2. **Hyperview Views**
2. **Lossless Views**
- **Spec**: Full inventory of all deltas composing an object
- **Implementation**: `HyperviewDomain` correctly accumulates deltas by entity/property
- **Implementation**: `LosslessViewDomain` correctly accumulates deltas by entity/property
- **Tests**: Good coverage of basic transformation, filtering by creator/host
3. **Lossy Views**
- **Spec**: Compression of hyperviews using resolution strategies
- **Spec**: Compression of lossless views using resolution strategies
- **Implementation**: Initializer/reducer/resolver pattern provides flexibility
- **Tests**: Domain-specific example (Role/Actor/Film) demonstrates concept
@ -127,4 +127,4 @@ The rhizome-node implementation demonstrates strong alignment with core spec con
## Conclusion
The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/hyperview/view pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios.
The rhizome-node implementation successfully captures the core concepts of the spec but requires significant work to achieve full compliance. The foundation is solid, with the delta/lossless/lossy pipeline working as designed. However, advanced features like queries, schemas, and sophisticated conflict resolution remain unimplemented. The test suite would benefit from expanded coverage of edge cases, validation, and distributed system scenarios.

spec.md
View File

@ -8,11 +8,11 @@
* a `primitive` is a literal string, number or boolean value whose meaning is not tied up in its being a reference to a larger whole.
* An `object` is a composite object whose entire existence is encoded as the set of deltas that reference it. An object is identified by a unique `reference`, and every delta that includes that `reference` is asserting a claim about some property of that object.
* A `negation` is a specific kind of delta that includes a pointer with the name `negates`, a `target` reference to another delta, and a `context` called `negated_by`.
* a `schema` represents a template by which an `object` can be compiled into a `hyperview`. A schema specifies which properties of that object are included, and it specifies schemas for the objects referenced by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress.
* For instance, a `hyperview` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc.
* a `schema` represents a template by which an `object` can be compiled into a `lossless view`. A schema specifies which properties of that object are included, and it specifies schemas for the objects referenced by the deltas within those properties. A schema must terminate in primitive schemas to avoid an infinite regress.
* For instance, a `lossless view` "User" of a user may include references to friends. If those friends are in turn encoded as instances of the "User" schema then all of *their* friends would be fully encoded, etc.
* This could lead to circular references and arbitrarily deep nesting, which runs into the problem of "returning the entire graph". So our schema should specify, for instance, that the "friends" field apply the "Summary" schema to referenced users rather than the "User" schema, where the "Summary" schema simply resolves to username and photo.
* A `hyperview` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a hyperview of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance.
* A `hyperview` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a hyperview, these references would be expanded to contain hyperviews of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress.
* A `view` is a compression of a `hyperview` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`.
* Note that in a hyperview any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context.
* In collapsing a hyperview into a view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average".
* A `lossless view` is a representation of an `object` that includes a full inventory of all of the deltas that compose that object. So for instance, a lossless view of the object representing the user "Alice" might include `alice.name`, which contains an array of all deltas with a pointer whose `target` is the ID of Alice and whose context is `name`. Such deltas would likely include a second pointer with the name `name` and the target a primitive string "Alice", for instance.
* A `lossless view` may also include nested delta/object layering. Consider `alice.friends`, which would include all deltas asserting friendship between Alice and some other person. Each such delta would reference a different friend object. In a lossless view, these references would be expanded to contain lossless views of those friends. Schemas, as defined above, would be applied to constrain tree depth and avoid infinite regress.
* A `lossy view` is a compression of a `lossless view` that removes delta information and flattens the structure into a standard domain object, typically in JSON. So instead of `alice.name` resolving to a list of deltas that assert the object's name it might simply resolve to `"alice"`.
* Note that in a lossless view any property of an object necessarily resolves to a set of deltas, even if it's an empty set, because we cannot anticipate how many deltas exist that assert values on that context.
* In collapsing a lossless view into a lossy view we may specify `resolution strategies` on each field of the schema. A resolution strategy takes as input the set of all deltas targeting that context and returns as output the value in the lossy view. So if we have 15 deltas asserting the value for an object's name, our resolution strategy may simply say "return the target of the `name` pointer associated with the most recent delta", or it may say "return an array of names". If the value is numeric it may say "take the max" or "take the min" or "take the average".
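The collapse from lossless to lossy described above can be sketched concretely. The shapes below (`Delta`, `LosslessProperty`, `ResolutionStrategy`) are illustrative assumptions chosen to mirror the spec's vocabulary, not the actual rhizome-node types:

```typescript
// Assumption: a minimal delta carrying a timestamp and a primitive value.
type Delta = { id: string; timestamp: number; value: string };

// A lossless view keeps every delta asserting a value for a property.
type LosslessProperty = Delta[];

// A resolution strategy collapses that delta set into a single lossy value.
type ResolutionStrategy = (deltas: LosslessProperty) => string | undefined;

// "Return the target of the most recent delta" (last-write-wins).
const mostRecent: ResolutionStrategy = (deltas) =>
  deltas.length === 0
    ? undefined
    : [...deltas].sort((a, b) => b.timestamp - a.timestamp)[0].value;

// Lossless view of alice.name: all deltas asserting a name.
const aliceName: LosslessProperty = [
  { id: 'd1', timestamp: 100, value: 'Alice' },
  { id: 'd2', timestamp: 200, value: 'Alicia' },
];

// Lossy view of alice.name under last-write-wins:
const resolved = mostRecent(aliceName); // 'Alicia'
```

An "array of names" or numeric "take the max" strategy would be expressed the same way, as a different function of the same delta set.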

View File

@ -18,7 +18,7 @@ export abstract class Collection<View> {
rhizomeNode?: RhizomeNode;
name: string;
eventStream = new EventEmitter();
view?: View;
lossy?: View;
constructor(name: string) {
this.name = name;
@ -34,7 +34,7 @@ export abstract class Collection<View> {
this.initializeView();
// Listen for completed transactions, and emit updates to event stream
this.rhizomeNode.hyperview.eventStream.on("updated", (id) => {
this.rhizomeNode.lossless.eventStream.on("updated", (id) => {
// TODO: Filter so we only get members of our collection
// TODO: Resolver / Delta Filter?
@ -128,7 +128,7 @@ export abstract class Collection<View> {
}
debug(`Getting ids for collection ${this.name}`)
const ids = new Set<string>();
for (const [entityId, names] of this.rhizomeNode.hyperview.referencedAs.entries()) {
for (const [entityId, names] of this.rhizomeNode.lossless.referencedAs.entries()) {
if (names.has(this.name)) {
ids.add(entityId);
}
@ -160,7 +160,7 @@ export abstract class Collection<View> {
);
const ingested = new Promise<boolean>((resolve) => {
this.rhizomeNode!.hyperview.eventStream.on("updated", (id: DomainEntityID) => {
this.rhizomeNode!.lossless.eventStream.on("updated", (id: DomainEntityID) => {
if (id === entityId) resolve(true);
})
});
@ -178,7 +178,7 @@ export abstract class Collection<View> {
await this.rhizomeNode!.deltaStream.publishDelta(delta);
// ingest the delta as though we had received it from a peer
this.rhizomeNode!.hyperview.ingestDelta(delta);
this.rhizomeNode!.lossless.ingestDelta(delta);
});
// Return updated view of this entity

View File

@ -7,20 +7,20 @@ import {Collection} from '../collections/collection-abstract';
import {TimestampResolver} from '../views/resolvers/timestamp-resolvers';
export class BasicCollection extends Collection<TimestampResolver> {
declare view?: TimestampResolver;
declare lossy?: TimestampResolver;
initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.view = new TimestampResolver(this.rhizomeNode.hyperview);
this.lossy = new TimestampResolver(this.rhizomeNode.lossless);
}
resolve(
id: string
) {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.view) throw new Error('view not initialized');
if (!this.lossy) throw new Error('lossy view not initialized');
const res = this.view.resolve([id]) || {};
const res = this.lossy.resolve([id]) || {};
return res[id];
}

View File

@ -6,20 +6,20 @@ class RelationalView extends TimestampResolver {
}
export class RelationalCollection extends Collection<RelationalView> {
declare view?: RelationalView;
declare lossy?: RelationalView;
initializeView() {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.view = new RelationalView(this.rhizomeNode.hyperview);
this.lossy = new RelationalView(this.rhizomeNode.lossless);
}
resolve(
id: string
): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.view) throw new Error('view not initialized');
if (!this.lossy) throw new Error('lossy view not initialized');
const res = this.view.resolve([id]) || {};
const res = this.lossy.resolve([id]) || {};
return res[id];
}

View File

@ -10,7 +10,7 @@ import {
SchemaApplicationOptions
} from '../schema/schema';
import { DefaultSchemaRegistry } from '../schema/schema-registry';
import { HyperviewOne } from '../views/hyperview';
import { LosslessViewOne } from '../views/lossless';
import { DomainEntityID } from '../core/types';
import { EntityProperties } from '../core/entity';
import { createDelta } from '@src/core';
@ -58,38 +58,38 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
initializeView(): void {
if (!this.rhizomeNode) throw new Error('not connected to rhizome');
this.view = new TimestampResolver(this.rhizomeNode.hyperview);
this.lossy = new TimestampResolver(this.rhizomeNode.lossless);
}
resolve(id: string): ResolvedViewOne | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
if (!this.view) throw new Error('view not initialized');
if (!this.lossy) throw new Error('lossy view not initialized');
const res = this.view.resolve([id]) || {};
const res = this.lossy.resolve([id]) || {};
return res[id];
}
// Validate an entity against the schema
validate(entity: T): SchemaValidationResult {
// Convert entity to a mock hyperview for validation
const mockHyperview: HyperviewOne = {
// Convert entity to a mock lossless view for validation
const mockLosslessView: LosslessViewOne = {
id: 'validation-mock',
referencedAs: [],
propertyDeltas: {},
};
for (const [key, value] of Object.entries(entity)) {
mockHyperview.propertyDeltas[key] = [createDelta('validation', 'validation')
mockLosslessView.propertyDeltas[key] = [createDelta('validation', 'validation')
.addPointer(key, value as string)
.buildV1(),
];
}
return this.schemaRegistry.validate('validation-mock', this.schema.id, mockHyperview);
return this.schemaRegistry.validate('validation-mock', this.schema.id, mockLosslessView);
}
// Apply schema to a hyperview
apply(view: HyperviewOne): SchemaAppliedView {
// Apply schema to a lossless view
apply(view: LosslessViewOne): SchemaAppliedView {
return this.schemaRegistry.applySchema(view, this.schema.id, this.applicationOptions);
}
@ -97,10 +97,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
getValidatedView(entityId: DomainEntityID): SchemaAppliedView | undefined {
if (!this.rhizomeNode) throw new Error('collection not connected to rhizome');
const hyperview = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperview) return undefined;
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) return undefined;
return this.apply(hyperview);
return this.apply(losslessView);
}
// Get all entities in this collection with schema validation
@ -169,10 +169,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
for (const entityId of entityIds) {
if (!this.rhizomeNode) continue;
const hyperview = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperview) continue;
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperview);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView);
if (validationResult.valid) {
stats.validEntities++;
@ -200,16 +200,16 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
debug(`No rhizome node connected`)
return [];
}
const hyperview = this.rhizomeNode.hyperview.compose(this.getIds());
if (!hyperview) {
debug(`No hyperview found`)
const losslessView = this.rhizomeNode.lossless.compose(this.getIds());
if (!losslessView) {
debug(`No lossless view found`)
return [];
}
debug(`getValidEntities, hyperview: ${JSON.stringify(hyperview, null, 2)}`)
debug(`getValidEntities, losslessView: ${JSON.stringify(losslessView, null, 2)}`)
debug(`Validating ${this.getIds().length} entities`)
return this.getIds().filter(entityId => {
debug(`Validating entity ${entityId}`)
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperview[entityId]);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView[entityId]);
debug(`Validation result for entity ${entityId}: ${JSON.stringify(validationResult)}`)
return validationResult.valid;
});
@ -221,10 +221,10 @@ export class TypedCollectionImpl<T extends Record<string, unknown>>
const invalid: Array<{ entityId: DomainEntityID; errors: string[] }> = [];
for (const entityId of this.getIds()) {
const hyperview = this.rhizomeNode.hyperview.compose([entityId])[entityId];
if (!hyperview) continue;
const losslessView = this.rhizomeNode.lossless.compose([entityId])[entityId];
if (!losslessView) continue;
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, hyperview);
const validationResult = this.schemaRegistry.validate(entityId, this.schema.id, losslessView);
if (!validationResult.valid) {
invalid.push({
entityId,

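The `validate()` implementation above works by wrapping each plain property value in a synthetic delta, so the ordinary schema machinery can run against an entity that has no real delta history. A self-contained sketch of that shape (type names mirror the diff, but the structures here are simplified assumptions):

```typescript
// Simplified stand-ins for the types used in the diff (assumed shapes).
type Delta = { creator: string; pointers: Record<string, unknown>[] };
type LosslessViewOne = {
  id: string;
  referencedAs: string[];
  propertyDeltas: Record<string, Delta[]>;
};

// Build a one-off "mock" lossless view from a plain entity, as validate()
// does: every property becomes a single synthetic delta from 'validation'.
function buildMockView(entity: Record<string, unknown>): LosslessViewOne {
  const view: LosslessViewOne = {
    id: 'validation-mock',
    referencedAs: [],
    propertyDeltas: {},
  };
  for (const [key, value] of Object.entries(entity)) {
    view.propertyDeltas[key] = [
      { creator: 'validation', pointers: [{ [key]: value }] },
    ];
  }
  return view;
}

const mock = buildMockView({ name: 'Alice', age: 30 });
```

The mock view can then be handed to the registry's `validate()` exactly as if it had been composed from the delta stream.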
24 src/core/context.ts Normal file
View File

@ -0,0 +1,24 @@
// A delta represents an assertion from a given (perspective/POV/context).
// So we want it to be fluent to express these in the local context,
// and propagated as deltas in a configurable manner, i.e. in configurable batches or immediately
// import {Delta} from '../core/types';
export class Entity {
}
export class Context {
}
export class Assertion {
}
export class View {
}
export class User {
}
export function example() {
}
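The classes in this new file are empty stubs; only the header comment gestures at local assertions being propagated "in configurable batches or immediately". Purely as an assumption about where that could go (none of these members exist in the diff), a minimal buffering context might look like:

```typescript
// Hypothetical sketch: a Context that buffers local assertions and flushes
// them either immediately or in batches. All names beyond the stub classes
// are invented for illustration.
type Assertion = { entity: string; property: string; value: unknown };

class BufferingContext {
  private pending: Assertion[] = [];
  constructor(
    private publish: (batch: Assertion[]) => void,
    private batchSize = 1, // 1 => effectively "immediate"
  ) {}

  assert(a: Assertion): void {
    this.pending.push(a);
    if (this.pending.length >= this.batchSize) this.flush();
  }

  flush(): void {
    if (this.pending.length === 0) return;
    this.publish(this.pending);
    this.pending = [];
  }
}

const sent: Assertion[][] = [];
const ctx = new BufferingContext((b) => sent.push(b), 2);
ctx.assert({ entity: 'e1', property: 'name', value: 'a' });
ctx.assert({ entity: 'e1', property: 'name', value: 'b' }); // triggers flush
```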

View File

@ -1,4 +1,5 @@
export * from './delta';
export * from './delta-builder';
export * from './types';
export * from './context';
export { Entity } from './entity';

View File

@ -1,7 +1,7 @@
import Debug from "debug";
import EventEmitter from "events";
import {Delta, DeltaID} from "../core/delta";
import {Hyperview} from "../views/hyperview";
import {Lossless} from "../views/lossless";
import {DomainEntityID, TransactionID} from "../core/types";
const debug = Debug('rz:transactions');
@ -72,7 +72,7 @@ export class Transactions {
transactions = new Map<TransactionID, Transaction>();
eventStream = new EventEmitter();
constructor(readonly hyperview: Hyperview) {}
constructor(readonly lossless: Lossless) {}
get(id: TransactionID): Transaction | undefined {
return this.transactions.get(id);
@ -116,7 +116,7 @@ export class Transactions {
{
const {transactionId, size} = getTransactionSize(delta) || {};
if (transactionId && size) {
debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`);
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `Transaction ${transactionId} has size ${size}`);
this.setSize(transactionId, size as number);

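The hunk above records a declared transaction size whenever a delta carries one, and the HTTP API elsewhere calls `transactions.isComplete(id)`. Assuming completeness means "we have seen as many distinct deltas as the declared size" (an assumption; the full class body is not shown here), the bookkeeping reduces to:

```typescript
// Assumed semantics: a transaction declares its total delta count, and is
// complete once that many distinct deltas have been ingested.
class TransactionTracker {
  private sizes = new Map<string, number>();
  private seen = new Map<string, Set<string>>();

  setSize(txId: string, size: number): void {
    this.sizes.set(txId, size);
  }

  ingest(txId: string, deltaId: string): void {
    if (!this.seen.has(txId)) this.seen.set(txId, new Set());
    this.seen.get(txId)!.add(deltaId);
  }

  isComplete(txId: string): boolean {
    const size = this.sizes.get(txId);
    if (size === undefined) return false;
    return (this.seen.get(txId)?.size ?? 0) >= size;
  }
}

const tx = new TransactionTracker();
tx.setSize('t1', 2);
tx.ingest('t1', 'd1');
const before = tx.isComplete('t1');
tx.ingest('t1', 'd2');
const after = tx.isComplete('t1');
```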
View File

@ -71,8 +71,8 @@ export class HttpApi {
res.json(this.rhizomeNode.peers.peers.length);
});
// Initialize hyperview and query endpoints
this.serveHyperview();
// Initialize lossless and query endpoints
this.serveLossless();
this.serveQuery();
}
@ -116,17 +116,17 @@ export class HttpApi {
});
}
serveHyperview() {
serveLossless() {
// Get all domain entity IDs. TODO: This won't scale
this.router.get('/hyperview/ids', (_req: express.Request, res: express.Response) => {
this.router.get('/lossless/ids', (_req: express.Request, res: express.Response) => {
res.json({
ids: Array.from(this.rhizomeNode.hyperview.domainEntities.keys())
ids: Array.from(this.rhizomeNode.lossless.domainEntities.keys())
});
});
// Get all transaction IDs. TODO: This won't scale
this.router.get('/transaction/ids', (_req: express.Request, res: express.Response) => {
const set = this.rhizomeNode.hyperview.referencedAs.get("_transaction");
const set = this.rhizomeNode.lossless.referencedAs.get("_transaction");
res.json({
ids: set ? Array.from(set.values()) : []
});
@ -135,7 +135,7 @@ export class HttpApi {
// View a single transaction
this.router.get('/transaction/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req;
const v = this.rhizomeNode.hyperview.compose([id]);
const v = this.rhizomeNode.lossless.compose([id]);
const ent = v[id];
if (!ent.referencedAs?.includes("_transaction")) {
res.status(400).json({error: "Entity is not a transaction", id});
@ -144,19 +144,19 @@ export class HttpApi {
res.json({
...ent,
isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id)
});
});
// Get a hyperview of a single domain entity
this.router.get('/hyperview/:id', (req: express.Request, res: express.Response) => {
// Get a lossless view of a single domain entity
this.router.get('/lossless/:id', (req: express.Request, res: express.Response) => {
const {params: {id}} = req;
const v = this.rhizomeNode.hyperview.compose([id]);
const v = this.rhizomeNode.lossless.compose([id]);
const ent = v[id];
res.json({
...ent,
isComplete: this.rhizomeNode.hyperview.transactions.isComplete(id)
isComplete: this.rhizomeNode.lossless.transactions.isComplete(id)
});
});
}

View File

@ -2,7 +2,7 @@ import Debug from 'debug';
import {CREATOR, HTTP_API_ADDR, HTTP_API_ENABLE, HTTP_API_PORT, PEER_ID, PUBLISH_BIND_ADDR, PUBLISH_BIND_HOST, PUBLISH_BIND_PORT, REQUEST_BIND_ADDR, REQUEST_BIND_HOST, REQUEST_BIND_PORT, SEED_PEERS, STORAGE_TYPE, STORAGE_PATH} from './config';
import {DeltaStream, parseAddressList, PeerAddress, Peers, PubSub, RequestReply} from './network';
import {HttpServer} from './http/index';
import {Hyperview} from './views';
import {Lossless} from './views';
import {QueryEngine, StorageQueryEngine} from './query';
import {DefaultSchemaRegistry} from './schema';
import {DeltaQueryStorage, StorageFactory, StorageConfig} from './storage';
@ -29,7 +29,7 @@ export class RhizomeNode {
requestReply: RequestReply;
httpServer: HttpServer;
deltaStream: DeltaStream;
hyperview: Hyperview;
lossless: Lossless;
peers: Peers;
queryEngine: QueryEngine;
storageQueryEngine: StorageQueryEngine;
@ -70,14 +70,14 @@ export class RhizomeNode {
this.httpServer = new HttpServer(this);
this.deltaStream = new DeltaStream(this);
this.peers = new Peers(this);
this.hyperview = new Hyperview(this);
this.lossless = new Lossless(this);
this.schemaRegistry = new DefaultSchemaRegistry();
// Initialize storage backend
this.deltaStorage = StorageFactory.create(this.config.storage!);
// Initialize query engines (both hyperview-based and storage-based)
this.queryEngine = new QueryEngine(this.hyperview, this.schemaRegistry);
// Initialize query engines (both lossless-based and storage-based)
this.queryEngine = new QueryEngine(this.lossless, this.schemaRegistry);
this.storageQueryEngine = new StorageQueryEngine(this.deltaStorage, this.schemaRegistry);
}
@ -91,10 +91,10 @@ export class RhizomeNode {
}: { syncOnStart?: boolean } = {}): Promise<void> {
debug(`[${this.config.peerId}]`, 'Starting node (waiting for ready)...');
// Connect our hyperview to the delta stream
// Connect our lossless view to the delta stream
this.deltaStream.subscribeDeltas(async (delta) => {
// Ingest into hyperview
this.hyperview.ingestDelta(delta);
// Ingest into lossless view
this.lossless.ingestDelta(delta);
// Also store in persistent storage
try {
@ -150,11 +150,11 @@ export class RhizomeNode {
}
/**
* Sync existing hyperview data to persistent storage
* Sync existing lossless view data to persistent storage
* Useful for migrating from memory-only to persistent storage
*/
async syncToStorage(): Promise<void> {
debug(`[${this.config.peerId}]`, 'Syncing hyperview to storage');
debug(`[${this.config.peerId}]`, 'Syncing lossless view to storage');
const allDeltas = this.deltaStream.deltasAccepted;
let synced = 0;

View File

@ -2,7 +2,7 @@ import jsonLogic from 'json-logic-js';
const { apply, is_logic } = jsonLogic;
import Debug from 'debug';
import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema';
import { Hyperview, HyperviewMany, HyperviewOne, valueFromDelta } from '../views/hyperview';
import { Lossless, LosslessViewMany, LosslessViewOne, valueFromDelta } from '../views/lossless';
import { DomainEntityID } from '../core/types';
import { Delta, DeltaFilter } from '../core/delta';
@ -31,14 +31,14 @@ export interface QueryOptions {
}
export interface QueryResult {
entities: HyperviewMany;
entities: LosslessViewMany;
totalFound: number;
limited: boolean;
}
export class QueryEngine {
constructor(
private hyperview: Hyperview,
private lossless: Lossless,
private schemaRegistry: SchemaRegistry
) {}
@ -96,12 +96,12 @@ export class QueryEngine {
const candidateEntityIds = this.discoverEntitiesBySchema(schemaId);
debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`);
// 2. Compose hyperviews for all candidates
const allViews = this.hyperview.compose(candidateEntityIds, options.deltaFilter);
debug(`Composed ${Object.keys(allViews).length} hyperviews`);
// 2. Compose lossless views for all candidates
const allViews = this.lossless.compose(candidateEntityIds, options.deltaFilter);
debug(`Composed ${Object.keys(allViews).length} lossless views`);
// 3. Apply JSON Logic filter if provided
let filteredViews: HyperviewMany = allViews;
let filteredViews: LosslessViewMany = allViews;
if (filter) {
filteredViews = this.applyJsonLogicFilter(allViews, filter, schemaId);
@ -132,10 +132,10 @@ export class QueryEngine {
/**
* Query for a single entity by ID with schema validation
*/
async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<HyperviewOne | null> {
async queryOne(schemaId: SchemaID, entityId: DomainEntityID): Promise<LosslessViewOne | null> {
debug(`Querying single entity ${entityId} with schema ${schemaId}`);
const views = this.hyperview.compose([entityId]);
const views = this.lossless.compose([entityId]);
const view = views[entityId];
if (!view) {
@ -165,7 +165,7 @@ export class QueryEngine {
// Strategy: Find entities that have deltas for the schema's required properties
const requiredProperties = schema.requiredProperties || [];
const allEntityIds = Array.from(this.hyperview.domainEntities.keys());
const allEntityIds = Array.from(this.lossless.domainEntities.keys());
if (requiredProperties.length === 0) {
// No required properties - return all entities
@ -177,7 +177,7 @@ export class QueryEngine {
const candidateEntities: DomainEntityID[] = [];
for (const entityId of allEntityIds) {
const entity = this.hyperview.domainEntities.get(entityId);
const entity = this.lossless.domainEntities.get(entityId);
if (!entity) continue;
// Check if entity has deltas for all required properties
@ -195,28 +195,28 @@ export class QueryEngine {
}
/**
* Apply JSON Logic filter to hyperviews
* This requires converting each hyperview to a queryable object
* Apply JSON Logic filter to lossless views
* This requires converting each lossless view to a queryable object
*/
private applyJsonLogicFilter(
views: HyperviewMany,
views: LosslessViewMany,
filter: JsonLogic,
schemaId: SchemaID
): HyperviewMany {
): LosslessViewMany {
const schema = this.schemaRegistry.get(schemaId);
if (!schema) {
debug(`Cannot filter without schema ${schemaId}`);
return views;
}
const filteredViews: HyperviewMany = {};
const filteredViews: LosslessViewMany = {};
let hasFilterErrors = false;
const filterErrors: string[] = [];
for (const [entityId, view] of Object.entries(views)) {
try {
// Convert hyperview to queryable object using schema
const queryableObject = this.hyperviewToQueryableObject(view, schema);
// Convert lossless view to queryable object using schema
const queryableObject = this.losslessViewToQueryableObject(view, schema);
// Apply JSON Logic filter
const matches = apply(filter, queryableObject);
@ -246,16 +246,16 @@ export class QueryEngine {
}
/**
* Convert a hyperview to a queryable object based on schema
* Convert a lossless view to a queryable object based on schema
* Uses simple resolution strategies for now
*/
private hyperviewToQueryableObject(view: HyperviewOne, schema: ObjectSchema): Record<string, unknown> {
private losslessViewToQueryableObject(view: LosslessViewOne, schema: ObjectSchema): Record<string, unknown> {
const obj: Record<string, unknown> = {
id: view.id,
_referencedAs: view.referencedAs
};
// Convert each schema property from hyperview deltas
// Convert each schema property from lossless view deltas
for (const [propertyId, propertySchema] of Object.entries(schema.properties)) {
const deltas = view.propertyDeltas[propertyId] || [];
@ -315,7 +315,7 @@ export class QueryEngine {
/**
* Check if an entity matches a schema (basic validation)
*/
private entityMatchesSchema(view: HyperviewOne, schemaId: SchemaID): boolean {
private entityMatchesSchema(view: LosslessViewOne, schemaId: SchemaID): boolean {
const schema = this.schemaRegistry.get(schemaId);
if (!schema) return false;
@ -337,7 +337,7 @@ export class QueryEngine {
* Get statistics about queryable entities
*/
getStats() {
const totalEntities = this.hyperview.domainEntities.size;
const totalEntities = this.lossless.domainEntities.size;
const registeredSchemas = this.schemaRegistry.list().length;
return {

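`applyJsonLogicFilter` above delegates to json-logic-js's `apply(filter, data)` after flattening each entity view into a queryable object. To keep this sketch dependency-free, here is a toy evaluator covering only the `var` and `==` operators a typical filter uses; the real engine uses the json-logic-js package, not this stand-in:

```typescript
// Toy stand-in for json-logic-js apply(), handling only {"var": ...} and
// {"==": [a, b]}, so the filtering flow can be shown without the dependency.
type Json = string | number | boolean | null | Json[] | { [k: string]: Json };

function applyLogic(rule: Json, data: Record<string, Json>): Json {
  if (rule === null || typeof rule !== 'object' || Array.isArray(rule)) return rule;
  if ('var' in rule) return data[rule['var'] as string] ?? null;
  if ('==' in rule) {
    const [a, b] = (rule['=='] as Json[]).map((r) => applyLogic(r, data));
    return a === b;
  }
  throw new Error('unsupported operator in sketch');
}

// Filter queryable objects the way the query engine filters entity views.
const views = [
  { id: 'u1', status: 'active' },
  { id: 'u2', status: 'inactive' },
];
const filter: Json = { '==': [{ var: 'status' }, 'active'] };
const matching = views.filter((v) => applyLogic(filter, v) === true);
```

With the real library, the `views.filter` line is the same; only `applyLogic` is replaced by `apply` from json-logic-js.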
View File

@ -14,9 +14,9 @@ import {
SchemaApplicationOptions,
ResolutionContext
} from '../schema/schema';
import { Hyperview, HyperviewOne } from '../views/hyperview';
import { Lossless, LosslessViewOne } from '../views/lossless';
import { DomainEntityID, PropertyID, PropertyTypes } from '../core/types';
import { CollapsedDelta } from '../views/hyperview';
import { CollapsedDelta } from '../views/lossless';
import { Delta } from '@src/core';
const debug = Debug('rz:schema-registry');
@ -103,7 +103,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
}
}
validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewOne): SchemaValidationResult {
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult {
const schema = this.get(schemaId);
if (!schema) {
return {
@ -282,7 +282,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
}
applySchema(
view: HyperviewOne,
view: LosslessViewOne,
schemaId: SchemaID,
options: SchemaApplicationOptions = {}
): SchemaAppliedView {
@ -333,9 +333,9 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
* Resolves references to other entities according to schema specifications
*/
applySchemaWithNesting(
view: HyperviewOne,
view: LosslessViewOne,
schemaId: SchemaID,
hyperview: Hyperview,
losslessView: Lossless,
options: SchemaApplicationOptions = {}
): SchemaAppliedViewWithNesting {
const { maxDepth = 3, includeMetadata = true, strictValidation = false } = options;
@ -344,16 +344,16 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
return this.resolveNestedView(
view,
schemaId,
hyperview,
losslessView,
resolutionContext,
{ includeMetadata, strictValidation }
);
}
private resolveNestedView(
view: HyperviewOne,
view: LosslessViewOne,
schemaId: SchemaID,
hyperview: Hyperview,
losslessView: Lossless,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting {
@ -401,7 +401,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty(
deltas,
referenceSchema,
hyperview,
losslessView,
context.withDepth(context.currentDepth + 1),
options,
view.id
@ -415,7 +415,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedViews = this.resolveReferenceProperty(
deltas,
referenceSchema,
hyperview,
losslessView,
context.withDepth(context.currentDepth + 1),
options,
view.id
@ -449,7 +449,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
private resolveReferenceProperty(
deltas: Delta[],
referenceSchema: ReferenceSchema,
hyperview: Hyperview,
losslessView: Lossless,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean },
parentEntityId: string
@ -469,7 +469,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta,
parentEntityId,
referenceSchema.targetSchema,
hyperview,
losslessView,
context,
options
);
@ -480,8 +480,8 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const referenceIds = this.extractReferenceIdsFromDelta(delta, parentEntityId);
for (const referenceId of referenceIds) {
try {
// Get the referenced entity's hyperview
const referencedViews = hyperview.compose([referenceId]);
// Get the referenced entity's lossless view
const referencedViews = losslessView.compose([referenceId]);
const referencedView = referencedViews[referenceId];
if (referencedView) {
@ -489,7 +489,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView(
referencedView,
referenceSchema.targetSchema,
hyperview,
losslessView,
context,
options
);
@ -514,7 +514,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
delta: Delta,
parentEntityId: string,
targetSchema: SchemaID,
hyperview: Hyperview,
losslessView: Lossless,
context: ResolutionContext,
options: { includeMetadata: boolean; strictValidation: boolean }
): SchemaAppliedViewWithNesting | null {
@ -536,7 +536,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
// Count entity references vs scalars
if (typeof target === 'string') {
const referencedViews = hyperview.compose([target]);
const referencedViews = losslessView.compose([target]);
if (referencedViews[target]) {
entityReferenceCount++;
} else {
@ -568,7 +568,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') {
// Try to resolve as entity reference
try {
const referencedViews = hyperview.compose([target]);
const referencedViews = losslessView.compose([target]);
const referencedView = referencedViews[target];
if (referencedView) {
@ -576,7 +576,7 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
const nestedView = this.resolveNestedView(
referencedView,
targetSchema,
hyperview,
losslessView,
context,
options
);
@ -601,14 +601,14 @@ export class DefaultSchemaRegistry implements SchemaRegistry {
if (typeof target === 'string') {
// Try to resolve as entity reference
try {
const referencedViews = hyperview.compose([target]);
const referencedViews = losslessView.compose([target]);
const referencedView = referencedViews[target];
if (referencedView) {
const nestedView = this.resolveNestedView(
referencedView,
targetSchema,
hyperview,
losslessView,
context,
options
);

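`applySchemaWithNesting` threads a `maxDepth` through `resolveNestedView` so reference chains stop expanding at a bound. Stripped of schemas and delta plumbing, the recursion pattern looks like this (a sketch under assumed, heavily simplified types; cycle handling is an assumption):

```typescript
// Assumed simplified shape: each entity may reference others by id; we
// expand references depth-first, capping at maxDepth and guarding cycles.
type Entity = { id: string; refs: string[] };
type Resolved = { id: string; refs: (Resolved | string)[] };

function resolveNested(
  id: string,
  store: Map<string, Entity>,
  maxDepth: number,
  depth = 0,
  visited: Set<string> = new Set(),
): Resolved | string {
  const entity = store.get(id);
  // Past the depth limit or in a cycle: leave the raw id as a leaf.
  if (!entity || depth >= maxDepth || visited.has(id)) return id;
  visited.add(id);
  return {
    id,
    refs: entity.refs.map((r) =>
      resolveNested(r, store, maxDepth, depth + 1, visited),
    ),
  };
}

const store = new Map<string, Entity>([
  ['a', { id: 'a', refs: ['b'] }],
  ['b', { id: 'b', refs: ['a'] }], // cycle back to a
]);
const resolved = resolveNested('a', store, 3);
```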
View File

@ -1,6 +1,6 @@
import { DomainEntityID, PropertyID, PropertyTypes } from "../core/types";
import { HyperviewOne } from "../views/hyperview";
import { CollapsedDelta } from "../views/hyperview";
import { LosslessViewOne } from "../views/lossless";
import { CollapsedDelta } from "../views/lossless";
// Base schema types
export type SchemaID = string;
@ -52,7 +52,7 @@ export interface SchemaRegistry {
register(schema: ObjectSchema): void;
get(id: SchemaID): ObjectSchema | undefined;
list(): ObjectSchema[];
validate(entityId: DomainEntityID, schemaId: SchemaID, view: HyperviewOne): SchemaValidationResult;
validate(entityId: DomainEntityID, schemaId: SchemaID, view: LosslessViewOne): SchemaValidationResult;
}
// Validation result types
@ -76,7 +76,7 @@ export interface SchemaApplicationOptions {
strictValidation?: boolean;
}
// Applied schema result - a hyperview filtered through a schema
// Applied schema result - a lossless view filtered through a schema
export interface SchemaAppliedView {
id: DomainEntityID;
schemaId: SchemaID;
@ -105,7 +105,7 @@ export interface SchemaAppliedViewWithNesting extends SchemaAppliedView {
export interface TypedCollection<T> {
schema: ObjectSchema;
validate(entity: T): SchemaValidationResult;
apply(view: HyperviewOne): SchemaAppliedView;
apply(view: LosslessViewOne): SchemaAppliedView;
}
// Built-in schema helpers

View File

@ -1,3 +1,3 @@
export * from './hyperview';
export * from './view';
export * from './lossless';
export * from './lossy';
export * from './resolvers';

View File

@ -9,7 +9,7 @@ import {Transactions} from '../features/transactions';
import {DomainEntityID, PropertyID, PropertyTypes, TransactionID, ViewMany} from "../core/types";
import {Negation} from '../features/negation';
import {NegationHelper} from '../features/negation';
const debug = Debug('rz:hyperview');
const debug = Debug('rz:lossless');
export type CollapsedPointer = {[key: PropertyID]: PropertyTypes};
@ -49,7 +49,7 @@ export function valueFromDelta(
}
// TODO: Store property deltas as references to reduce memory footprint
export type HyperviewOne = {
export type LosslessViewOne = {
id: DomainEntityID,
referencedAs?: string[];
propertyDeltas: {
@ -57,21 +57,21 @@ export type HyperviewOne = {
}
}
export type CollapsedViewOne = Omit<HyperviewOne, 'propertyDeltas'> & {
export type CollapsedViewOne = Omit<LosslessViewOne, 'propertyDeltas'> & {
propertyCollapsedDeltas: {
[key: PropertyID]: CollapsedDelta[]
}
};
export type HyperviewMany = ViewMany<HyperviewOne>;
export type LosslessViewMany = ViewMany<LosslessViewOne>;
export type CollapsedViewMany = ViewMany<CollapsedViewOne>;
class HyperviewEntityMap extends Map<DomainEntityID, HyperviewEntity> {};
class LosslessEntityMap extends Map<DomainEntityID, LosslessEntity> {};
class HyperviewEntity {
class LosslessEntity {
properties = new Map<PropertyID, Set<Delta>>();
constructor(readonly hyperview: Hyperview, readonly id: DomainEntityID) {}
constructor(readonly lossless: Lossless, readonly id: DomainEntityID) {}
addDelta(delta: Delta | DeltaV2) {
// Convert DeltaV2 to DeltaV1 if needed
@ -93,7 +93,7 @@ class HyperviewEntity {
propertyDeltas.add(delta);
}
debug(`[${this.hyperview.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta));
debug(`[${this.lossless.rhizomeNode.config.peerId}]`, `entity ${this.id} added delta:`, JSON.stringify(delta));
}
toJSON() {
@ -103,14 +103,14 @@ class HyperviewEntity {
}
return {
id: this.id,
referencedAs: Array.from(this.hyperview.referencedAs.get(this.id) ?? []),
referencedAs: Array.from(this.lossless.referencedAs.get(this.id) ?? []),
properties
};
}
}
export class Hyperview {
domainEntities = new HyperviewEntityMap();
export class Lossless {
domainEntities = new LosslessEntityMap();
transactions: Transactions;
eventStream = new EventEmitter();
@ -160,7 +160,7 @@ export class Hyperview {
for (const entityId of affectedEntities) {
let ent = this.domainEntities.get(entityId);
if (!ent) {
ent = new HyperviewEntity(this, entityId);
ent = new LosslessEntity(this, entityId);
this.domainEntities.set(entityId, ent);
}
// Add negation delta to the entity
@ -189,7 +189,7 @@ export class Hyperview {
let ent = this.domainEntities.get(target);
if (!ent) {
ent = new HyperviewEntity(this, target);
ent = new LosslessEntity(this, target);
this.domainEntities.set(target, ent);
}
@ -208,7 +208,7 @@ export class Hyperview {
return transactionId;
}
decompose(view: HyperviewOne): Delta[] {
decompose(view: LosslessViewOne): Delta[] {
const allDeltas: Delta[] = [];
const seenDeltaIds = new Set<DeltaID>();
@ -225,8 +225,8 @@ export class Hyperview {
return allDeltas;
}
compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): HyperviewMany {
const view: HyperviewMany = {};
compose(entityIds?: DomainEntityID[], deltaFilter?: DeltaFilter): LosslessViewMany {
const view: LosslessViewMany = {};
entityIds = entityIds ?? Array.from(this.domainEntities.keys());
for (const entityId of entityIds) {
@ -391,4 +391,4 @@ export class Hyperview {
}
// TODO: point-in-time queries
}
}

View File

@ -1,12 +1,12 @@
// We have the hyperview transformation of the delta stream.
// We want to enable transformations from the hyperview,
// into various possible "view" views that combine or exclude some information.
// We have the lossless transformation of the delta stream.
// We want to enable transformations from the lossless view,
// into various possible "lossy" views that combine or exclude some information.
import Debug from 'debug';
import {Delta, DeltaFilter, DeltaID} from "../core/delta";
import {Hyperview, HyperviewOne} from "./hyperview";
import {Lossless, LosslessViewOne} from "./lossless";
import {DomainEntityID, PropertyID, PropertyTypes, ViewMany} from "../core/types";
const debug = Debug('rz:view');
const debug = Debug('rz:lossy');
type PropertyMap = Record<PropertyID, PropertyTypes>;
@ -17,20 +17,20 @@ export type LossyViewOne<T = PropertyMap> = {
export type LossyViewMany<T = PropertyMap> = ViewMany<LossyViewOne<T>>;
// We support incremental updates of view models.
// We support incremental updates of lossy models.
export abstract class Lossy<Accumulator, Result = Accumulator> {
deltaFilter?: DeltaFilter;
private accumulator?: Accumulator;
initializer?(): Accumulator;
abstract reducer(acc: Accumulator, cur: HyperviewOne): Accumulator;
abstract reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator;
resolver?(acc: Accumulator, entityIds: DomainEntityID[]): Result;
constructor(
readonly hyperview: Hyperview,
readonly lossless: Lossless,
) {
this.hyperview.eventStream.on("updated", (id, deltaIds) => {
debug(`[${this.hyperview.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`,
this.lossless.eventStream.on("updated", (id, deltaIds) => {
debug(`[${this.lossless.rhizomeNode.config.peerId}] entity ${id} updated, deltaIds:`,
JSON.stringify(deltaIds));
this.ingestUpdate(id, deltaIds);
@ -46,16 +46,16 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
if (!this.deltaFilter) return true;
return this.deltaFilter(delta);
};
const hyperviewPartial = this.hyperview.compose([entityId], combinedFilter);
const losslessPartial = this.lossless.compose([entityId], combinedFilter);
if (!hyperviewPartial) {
// This should not happen; this should only be called after the hyperview has been updated
console.error(`Hyperview for entity ${entityId} not found`);
if (!losslessPartial) {
// This should not happen; this should only be called after the lossless view has been updated
console.error(`Lossless view for entity ${entityId} not found`);
return;
}
const latest = this.accumulator || this.initializer?.() || {} as Accumulator;
this.accumulator = this.reducer(latest, hyperviewPartial[entityId]);
this.accumulator = this.reducer(latest, losslessPartial[entityId]);
}
// Resolve the current state of the view
@ -65,7 +65,7 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
}
if (!entityIds) {
entityIds = Array.from(this.hyperview.domainEntities.keys());
entityIds = Array.from(this.lossless.domainEntities.keys());
}
if (!this.resolver) {
@ -76,5 +76,3 @@ export abstract class Lossy<Accumulator, Result = Accumulator> {
}
}
// "Lossy" can simply be called "View"
export abstract class View<Accumulator, Result> extends Lossy<Accumulator, Result> {};

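The abstract class above is an incremental fold: each updated entity's lossless partial is fed through `reducer` into a persistent accumulator, and `resolver` materializes a result on demand. The smallest concrete instance of that pattern, with the class and event machinery elided (the delta-count view itself is illustrative, not from the diff):

```typescript
// Minimal concrete instance of the accumulator/reducer/resolver pattern:
// a view that counts deltas per entity.
type ViewOne = { id: string; propertyDeltas: Record<string, unknown[]> };
type Accumulator = Record<string, number>; // entityId -> delta count

const initializer = (): Accumulator => ({});

const reducer = (acc: Accumulator, cur: ViewOne): Accumulator => {
  const count = Object.values(cur.propertyDeltas)
    .reduce((n, deltas) => n + deltas.length, 0);
  acc[cur.id] = (acc[cur.id] ?? 0) + count;
  return acc;
};

const resolver = (acc: Accumulator, ids: string[]): Accumulator =>
  Object.fromEntries(ids.map((id) => [id, acc[id] ?? 0]));

// Simulate two incremental updates for the same entity.
let acc = initializer();
acc = reducer(acc, { id: 'e1', propertyDeltas: { name: [{}], age: [{}] } });
acc = reducer(acc, { id: 'e1', propertyDeltas: { name: [{}] } });
const result = resolver(acc, ['e1']);
```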
View File

@ -1,7 +1,7 @@
import { Hyperview, HyperviewOne } from "../hyperview";
import { Lossy } from '../view';
import { Lossless, LosslessViewOne } from "../lossless";
import { Lossy } from '../lossy';
import { DomainEntityID, PropertyID, ViewMany } from "../../core/types";
import { valueFromDelta } from "../hyperview";
import { valueFromDelta } from "../lossless";
import { EntityRecord, EntityRecordMany } from "@src/core/entity";
export type AggregationType = 'min' | 'max' | 'sum' | 'average' | 'count';
@ -53,13 +53,13 @@ function aggregateValues(values: number[], type: AggregationType): number {
export class AggregationResolver extends Lossy<Accumulator, Result> {
constructor(
hyperview: Hyperview,
lossless: Lossless,
private config: AggregationConfig
) {
super(hyperview);
super(lossless);
}
reducer(acc: Accumulator, cur: HyperviewOne): Accumulator {
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
if (!acc[cur.id]) {
acc[cur.id] = { id: cur.id, properties: {} };
}
@ -111,41 +111,41 @@ export class AggregationResolver extends Lossy<Accumulator, Result> {
// Convenience classes for common aggregation types
export class MinResolver extends AggregationResolver {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
constructor(lossless: Lossless, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'min');
super(hyperview, config);
super(lossless, config);
}
}
export class MaxResolver extends AggregationResolver {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
constructor(lossless: Lossless, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'max');
super(hyperview, config);
super(lossless, config);
}
}
export class SumResolver extends AggregationResolver {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
constructor(lossless: Lossless, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'sum');
super(hyperview, config);
super(lossless, config);
}
}
export class AverageResolver extends AggregationResolver {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
constructor(lossless: Lossless, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'average');
super(hyperview, config);
super(lossless, config);
}
}
export class CountResolver extends AggregationResolver {
constructor(hyperview: Hyperview, properties: PropertyID[]) {
constructor(lossless: Lossless, properties: PropertyID[]) {
const config: AggregationConfig = {};
properties.forEach(prop => config[prop] = 'count');
super(hyperview, config);
super(lossless, config);
}
}

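All of the convenience resolvers above funnel into `aggregateValues(values, type)`. A standalone version of that helper, matching the five `AggregationType`s in the diff (the empty-input behavior here is an assumption, since the helper's body is not shown):

```typescript
type AggregationType = 'min' | 'max' | 'sum' | 'average' | 'count';

// Standalone sketch of aggregateValues; empty input yields 0 by assumption.
function aggregateValues(values: number[], type: AggregationType): number {
  if (values.length === 0) return 0;
  switch (type) {
    case 'min': return Math.min(...values);
    case 'max': return Math.max(...values);
    case 'sum': return values.reduce((a, b) => a + b, 0);
    case 'average': return values.reduce((a, b) => a + b, 0) / values.length;
    case 'count': return values.length;
  }
}
```

So a `SumResolver` over a numeric property amounts to collecting that property's values per entity and reducing them with `aggregateValues(values, 'sum')`.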
View File

@ -1,5 +1,5 @@
import { PropertyID, PropertyTypes } from "../../../core/types";
import { CollapsedDelta } from "../../hyperview";
import { CollapsedDelta } from "../../lossless";
import Debug from 'debug';
const debug = Debug('rz:custom-resolver:plugin');

View File

@@ -1,5 +1,5 @@
 import { PropertyTypes } from "../../../../core/types";
-import { CollapsedDelta } from "../../../hyperview";
+import { CollapsedDelta } from "../../../../views/lossless";
 import { ResolverPlugin } from "../plugin";
 import Debug from 'debug';
 const debug = Debug('rz:concatenation-plugin');

View File

@@ -1,5 +1,5 @@
 import { PropertyTypes } from "../../../../core/types";
-import { CollapsedDelta } from "../../../hyperview";
+import { CollapsedDelta } from "../../../lossless";
 import { ResolverPlugin } from "../plugin";
 type FirstWriteWinsState = {

View File

@@ -1,5 +1,5 @@
 import { PropertyTypes } from "../../../../core/types";
-import { CollapsedDelta } from "../../../hyperview";
+import { CollapsedDelta } from "../../../lossless";
 import { ResolverPlugin } from "../plugin";
 type LastWriteWinsState = {

View File

@@ -1,5 +1,5 @@
 import { PropertyTypes } from "@src/core/types";
-import { CollapsedDelta } from "@src/views/hyperview";
+import { CollapsedDelta } from "@src/views/lossless";
 import { ResolverPlugin, DependencyStates } from "../plugin";
 type RunningAverageState = {

View File

@@ -1,5 +1,5 @@
-import { Hyperview, HyperviewOne } from "../../hyperview";
-import { Lossy } from '../../view';
+import { Lossless, LosslessViewOne } from "../../lossless";
+import { Lossy } from '../../lossy';
 import { DomainEntityID, PropertyID, PropertyTypes } from "../../../core/types";
 import { ResolverPlugin, DependencyStates } from "./plugin";
 import { EntityRecord } from "@src/core/entity";
@@ -46,14 +46,14 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
   /**
    * Creates a new CustomResolver instance
-   * @param hyperview - The Hyperview instance to use for delta tracking
+   * @param lossless - The Lossless instance to use for delta tracking
    * @param config - A mapping of property IDs to their resolver plugins
    */
   constructor(
-    hyperview: Hyperview,
+    lossless: Lossless,
     config: PluginMap
   ) {
-    super(hyperview);
+    super(lossless);
     this.config = config;
     this.buildDependencyGraph();
     this.executionOrder = this.calculateExecutionOrder();
@@ -244,7 +244,7 @@ export class CustomResolver extends Lossy<Accumulator, Result> {
   /**
    * Update the state with new deltas from the view
    */
-  reducer(acc: Accumulator, {id: entityId, propertyDeltas}: HyperviewOne): Accumulator {
+  reducer(acc: Accumulator, {id: entityId, propertyDeltas}: LosslessViewOne): Accumulator {
     debug(`Processing deltas for entity: ${entityId}`);
     debug('Property deltas:', JSON.stringify(propertyDeltas));

View File

@@ -1,6 +1,6 @@
 import { EntityProperties } from "../../core/entity";
-import { Hyperview, CollapsedDelta, valueFromDelta, HyperviewOne } from "../hyperview";
-import { Lossy } from '../view';
+import { Lossless, CollapsedDelta, valueFromDelta, LosslessViewOne } from "../lossless";
+import { Lossy } from '../lossy';
 import { DomainEntityID, PropertyID, PropertyTypes, Timestamp, ViewMany } from "../../core/types";
 import Debug from 'debug';
@@ -77,13 +77,13 @@ function compareWithTieBreaking(
 export class TimestampResolver extends Lossy<Accumulator, Result> {
   constructor(
-    hyperview: Hyperview,
+    lossless: Lossless,
     private tieBreakingStrategy: TieBreakingStrategy = 'delta-id'
   ) {
-    super(hyperview);
+    super(lossless);
   }
-  reducer(acc: Accumulator, cur: HyperviewOne): Accumulator {
+  reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
     if (!acc[cur.id]) {
       acc[cur.id] = { id: cur.id, properties: {} };
     }
@@ -138,26 +138,26 @@ export class TimestampResolver extends Lossy<Accumulator, Result> {
 // Convenience classes for different tie-breaking strategies
 export class CreatorIdTimestampResolver extends TimestampResolver {
-  constructor(hyperview: Hyperview) {
-    super(hyperview, 'creator-id');
+  constructor(lossless: Lossless) {
+    super(lossless, 'creator-id');
   }
 }
 export class DeltaIdTimestampResolver extends TimestampResolver {
-  constructor(hyperview: Hyperview) {
-    super(hyperview, 'delta-id');
+  constructor(lossless: Lossless) {
+    super(lossless, 'delta-id');
   }
 }
 export class HostIdTimestampResolver extends TimestampResolver {
-  constructor(hyperview: Hyperview) {
-    super(hyperview, 'host-id');
+  constructor(lossless: Lossless) {
+    super(lossless, 'host-id');
   }
 }
 export class LexicographicTimestampResolver extends TimestampResolver {
-  constructor(hyperview: Hyperview) {
-    super(hyperview, 'lexicographic');
+  constructor(lossless: Lossless) {
+    super(lossless, 'lexicographic');
   }
 }
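
For context on the `TimestampResolver` changes above, the resolution strategy it implements (highest timestamp wins, with a deterministic tie-break) can be sketched standalone. This is an illustrative snippet only: `DeltaLike` and `resolveLastWrite` are made up for the example and are not the repo's actual types or resolver code.

```typescript
// Illustrative sketch: timestamp-based last-write-wins with delta-id
// tie-breaking, the default strategy of the resolvers diffed above.
interface DeltaLike {
  id: string;        // delta identifier, used to break timestamp ties
  timestamp: number; // logical or wall-clock time of the write
  value: string;     // property value carried by the delta
}

// Highest timestamp wins; on a tie, the lexicographically greater
// delta id wins, so every node picks the same winner deterministically.
function resolveLastWrite(deltas: DeltaLike[]): DeltaLike | undefined {
  return deltas.reduce<DeltaLike | undefined>((winner, d) => {
    if (!winner) return d;
    if (d.timestamp !== winner.timestamp) {
      return d.timestamp > winner.timestamp ? d : winner;
    }
    return d.id > winner.id ? d : winner;
  }, undefined);
}

const winner = resolveLastWrite([
  { id: 'a1', timestamp: 5, value: 'alice' },
  { id: 'b2', timestamp: 7, value: 'bob' },
  { id: 'a9', timestamp: 7, value: 'carol' }, // ties with 'b2' on timestamp
]);
console.log(winner?.value); // prints "bob" ('b2' > 'a9' breaks the tie)
```

A content-independent tie-break like this is what makes the convenience classes (`creator-id`, `delta-id`, `host-id`, `lexicographic`) useful: peers that have seen the same deltas converge on the same resolved value without coordinating.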

View File

@@ -7,7 +7,7 @@
 - **Delta Lifecycle**:
   - Creation via `DeltaBuilder`
   - Propagation through `DeltaStream`
-  - Storage in `Hyperview` view
+  - Storage in `Lossless` view
   - Transformation in `Lossy` views
 ### 2. Network Layer
@@ -22,7 +22,7 @@
 ### 3. Storage
 - **In-Memory Storage**:
-  - `Hyperview` view maintains complete delta history
+  - `Lossless` view maintains complete delta history
   - `Lossy` views provide optimized access patterns
 - **Persistence**:
   - LevelDB integration
@@ -49,8 +49,8 @@
   - `request-reply.ts`: Direct node communication
 3. **Views**:
-   - `hyperview.ts`: Complete delta history
-   - `view.ts`: Derived, optimized views
+   - `lossless.ts`: Complete delta history
+   - `lossy.ts`: Derived, optimized views
 4. **Schema**:
   - `schema.ts`: Type definitions

todo.md
View File

@@ -12,7 +12,7 @@ This document tracks work needed to achieve full specification compliance, organ
 - [x] Add validation for pointer consistency
 ### 1.2 Complete Transaction Support ✅ (mostly)
-- [x] Implement transaction-based filtering in hyperviews
+- [x] Implement transaction-based filtering in lossless views
 - [x] Add transaction grouping in delta streams
 - [x] Test atomic transaction operations
 - [ ] Add transaction rollback capabilities (deferred - not critical for spec parity)
@@ -29,8 +29,8 @@ This document tracks work needed to achieve full specification compliance, organ
 ### 2.1 Negation Deltas ✅
 - [x] Implement negation delta type with "negates" pointer
 - [x] Add "negated_by" context handling
-- [x] Update hyperview to handle negations
-- [x] Update view resolvers to respect negations
+- [x] Update lossless view to handle negations
+- [x] Update lossy resolvers to respect negations
 - [x] Add comprehensive negation tests
 ### 2.2 Advanced Conflict Resolution ✅
@@ -50,7 +50,7 @@ This document tracks work needed to achieve full specification compliance, organ
 ### 3.1 Query Engine Foundation ✅
 - [x] Implement JSON Logic parser (using json-logic-js)
-- [x] Create query planner for hyperviews
+- [x] Create query planner for lossless views
 - [x] Add query execution engine (QueryEngine class)
 - [x] Implement schema-driven entity discovery
 - [x] Enable the skipped query tests
@@ -91,7 +91,7 @@ This document tracks work needed to achieve full specification compliance, organ
 ### 4.4 Schema-as-Deltas (Meta-Schema System)
 - [ ] Define schema entities that are stored as deltas in the system
-- [ ] Implement schema queries that return schema instances from hyperviews
+- [ ] Implement schema queries that return schema instances from lossless views
 - [ ] Create schema evolution through delta mutations
 - [ ] Add temporal schema queries (schema time-travel)
 - [ ] Build schema conflict resolution for competing schema definitions
@@ -186,7 +186,7 @@ This document tracks work needed to achieve full specification compliance, organ
 1. **Phase 1** ✅ - These are foundational requirements
 2. **Phase 2.1 (Negation)** ✅ - Core spec feature that affects all views
-3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper views
+3. **Phase 2.2 (Resolvers)** ✅ - Needed for proper lossy views
 4. **Phase 2.3 (Nesting)** ✅ - Depends on schemas and queries
 5. **Phase 3 (Query)** ✅ - Unlocks powerful data access
 6. **Phase 4 (Relational)** - Builds on query system