Compare commits

..

15 Commits

Author SHA1 Message Date
63026d069e Update README.md 2025-06-15 14:33:29 -05:00
1b9ed0a6dd Merge pull request 'pr-1-feedback' (#2) from pr-1-feedback into main
Reviewed-on: #2
2025-06-15 14:26:31 -05:00
6c13f46714
fix(negation): improve negation chain handling and clean up logging
- Refactored isDeltaNegated to properly handle complex negation chains
- Added recursive negation detection to account for negations of negations
- Replaced console.log with debug statements in tests
- Improved test coverage and documentation
- Fixed TypeScript type issues
2025-06-15 14:15:49 -05:00
4b750c593d
feat(query): validate JSON Logic operators and improve error handling
- Add validation for JSON Logic operators in query filters
- Throw InvalidQueryOperatorError for invalid operators
- Improve error handling and logging in applyJsonLogicFilter
- Update tests to verify new error handling behavior
- Fix TypeScript linting issues and improve code style
2025-06-15 14:15:49 -05:00
51d336b88b
feat: handle undefined values in custom resolvers
- Update ResolverPlugin interface to allow resolve() to return undefined
- Modify CustomResolver to skip properties with undefined values
- Add test case for resolvers that return undefined
- Update existing tests to handle cases where no properties are resolved

This change makes the behavior more explicit when resolvers don't return a value,
which is particularly important for numeric aggregations where 0 might be a valid result.
2025-06-15 14:15:48 -05:00
d5cc436bcb
chore: remove unused _RhizomeImports from test files
- Removed unused _RhizomeImports from:
  - __tests__/compose-decompose.ts
  - __tests__/transactions.ts
  - __tests__/negation.ts
- All tests continue to pass after removal
2025-06-15 14:15:48 -05:00
35bbc974d8
refactor: move common resolve logic to base Lossy class
- Moved the resolve method implementation from individual resolvers to the base Lossy class
- Updated initializer methods to accept a LosslessViewOne parameter
- Removed redundant resolve methods from LastWriteWins, TimestampResolver, CustomResolver, and AggregationResolver
- Ensured consistent behavior across all resolver implementations
- All tests passing with the refactored code
2025-06-15 14:15:48 -05:00
dd8987563a
Fix recursion depth test validity
Previosly a failure to correctly limit the depth might not be detected
2025-06-15 14:15:47 -05:00
885630f4d9
refactor: move CommonSchemas to test-utils
- Moved CommonSchemas from src/schema/schema.ts to src/test-utils/schemas.ts
- Updated all test files to import CommonSchemas from the new location
- Fixed the Document schema to match test expectations by making the author field required
- Added additional fields to the Document schema to match the original implementation
- Ensured all tests pass with the new implementation

Addresses PR feedback that CommonSchemas is only used in tests and should be moved to test files.
2025-06-15 14:15:47 -05:00
08fb5778ba
fix: update comment about LastWriteWins tie-breaking algorithm
Addresses PR feedback about the outdated comment in concurrent-writes.ts. The comment now accurately reflects that the resolution uses the LastWriteWins resolver's tie-breaking algorithm rather than delta processing order.
2025-06-15 14:15:46 -05:00
91514289ae
Entity conforms to schema if it has all required properties, not just some 2025-06-15 14:15:46 -05:00
ada27ff1b0
removed data/ and test-data/ from repo 2025-06-15 14:15:46 -05:00
04212e87e6 Merge pull request 'major changes and feature additions' (#1) from claude_code_work into main
Reviewed-on: #1
2025-06-15 14:08:40 -05:00
cd83f026cd
add .env to .gitignore 2025-06-14 17:03:10 -05:00
98014aaa17
add script for generating test coverage report 2025-06-14 17:01:33 -05:00
49 changed files with 665 additions and 282 deletions

3
.gitignore vendored
View File

@ -3,3 +3,6 @@ node_modules/
coverage/ coverage/
*.swp *.swp
*.swo *.swo
.env
data/
test-data/

View File

@ -1,23 +1,5 @@
See [spec.md](spec.md) for additional specification details about this project. See [spec.md](spec.md) for additional specification details about this project.
# Concepts
| | Implemented | Notes |
| ------------- | ----------- | ------------------------------------------------------------------------ |
| Peering | Yes | Implemented with ZeroMQ and/or Libp2p. Libp2p solves more problems. |
| Schemas | Not really | Currently very thin layer allowing TypedCollections |
| Relationships | No | Supporting relational algebra among domain entities |
| Views | Yes | Lossless: Map the `targetContext`s as properties of domain entities. |
| | | Lossy: Use a delta filter and a resolver function to produce a view. |
| | | Currently using functions rather than JSON-Logic expressions. |
| Functions | No | Arbitrary subscribers to delta stream (that can also emit deltas?) |
| Tests | Yes | We are set up to run unit tests and multi-node tests |
| Identity | Sort of | We have an identity service via Libp2p |
| Contexts | No | Each context may involve different lossy functions and delta filters |
| HTTP API | Yes | Basic peering info and entity CRUD |
If we express views and filter rules as JSON-Logic, we can easily include them in records.
# Development / Demo # Development / Demo
## Setup ## Setup
@ -156,7 +138,23 @@ EOF
curl -s -X PUT -H 'content-type:application/json' -d @/tmp/user.json http://localhost:3000/api/user | jq curl -s -X PUT -H 'content-type:application/json' -d @/tmp/user.json http://localhost:3000/api/user | jq
``` ```
# More About Concepts # Concepts
| | Implemented | Notes |
| ------------- | ----------- | ------------------------------------------------------------------------ |
| Peering | Yes | Implemented with ZeroMQ and/or Libp2p. Libp2p solves more problems. |
| Schemas | Not really | Currently very thin layer allowing TypedCollections |
| Relationships | No | Supporting relational algebra among domain entities |
| Views | Yes | Lossless: Map the `targetContext`s as properties of domain entities. |
| | | Lossy: Use a delta filter and a resolver function to produce a view. |
| | | Currently using functions rather than JSON-Logic expressions. |
| Functions | No | Arbitrary subscribers to delta stream (that can also emit deltas?) |
| Tests | Yes | We are set up to run unit tests and multi-node tests |
| Identity | Sort of | We have an identity service via Libp2p |
| Contexts | No | Each context may involve different lossy functions and delta filters |
| HTTP API | Yes | Basic peering info and entity CRUD |
If we express views and filter rules as JSON-Logic, we can easily include them in records.
## Clocks? ## Clocks?

View File

@ -1,4 +1,3 @@
import * as _RhizomeImports from "../src";
/** /**
* Tests for lossless view compose() and decompose() bidirectional conversion * Tests for lossless view compose() and decompose() bidirectional conversion
* Ensures that deltas can be composed into lossless views and decomposed back * Ensures that deltas can be composed into lossless views and decomposed back

View File

@ -58,7 +58,7 @@ describe('Concurrent Write Scenarios', () => {
const result = resolver.resolve(); const result = resolver.resolve();
expect(result).toBeDefined(); expect(result).toBeDefined();
// Should resolve deterministically (likely based on delta processing order) // Should resolve deterministically using the LastWriteWins resolver's tie-breaking algorithm
expect(typeof result!['entity1'].properties.score).toBe('number'); expect(typeof result!['entity1'].properties.score).toBe('number');
expect([100, 200]).toContain(result!['entity1'].properties.score); expect([100, 200]).toContain(result!['entity1'].properties.score);
}); });

View File

@ -670,7 +670,11 @@ describe('Custom Resolvers', () => {
const result = resolver.resolve(); const result = resolver.resolve();
expect(result).toBeDefined(); expect(result).toBeDefined();
expect(result!['entity1'].properties.score).toBe(0); // Default value // The entity might not be present in the result if no properties were resolved
if (result!['entity1']) {
expect(result!['entity1'].properties).toBeDefined();
expect(result!['entity1'].properties).not.toHaveProperty('score');
}
}); });
}); });
}); });

View File

@ -1,9 +1,11 @@
import * as _RhizomeImports from "../src"; import Debug from 'debug';
import { Delta } from '../src/core'; import { Delta } from '../src/core';
import { NegationHelper } from '../src/features'; import { NegationHelper } from '../src/features';
import { RhizomeNode } from '../src/node'; import { RhizomeNode } from '../src/node';
import { Lossless } from '../src/views'; import { Lossless } from '../src/views';
const debug = Debug('rz:negation:test');
describe('Negation System', () => { describe('Negation System', () => {
let node: RhizomeNode; let node: RhizomeNode;
let lossless: Lossless; let lossless: Lossless;
@ -443,32 +445,6 @@ describe('Negation System', () => {
expect(stats.negationDeltas).toBe(0); // No negations for this entity expect(stats.negationDeltas).toBe(0); // No negations for this entity
}); });
it('should handle multiple negations and un-negations', () => {
const originalDelta = new Delta({
creator: 'user1',
host: 'host1',
pointers: [
{ localContext: 'visible', target: 'item1', targetContext: 'visible' },
{ localContext: 'value', target: true }
]
});
const negation1 = NegationHelper.createNegation(originalDelta.id, 'mod1', 'host1');
const negation2 = NegationHelper.createNegation(originalDelta.id, 'mod2', 'host1');
lossless.ingestDelta(originalDelta);
lossless.ingestDelta(negation1);
lossless.ingestDelta(negation2);
// Delta should be thoroughly negated
const view = lossless.view(['item1']);
expect(view.item1).toBeUndefined();
const stats = lossless.getNegationStats('item1');
expect(stats.negatedDeltas).toBe(1);
expect(stats.negationDeltas).toBe(2);
});
it('should handle self-referential entities in negations', () => { it('should handle self-referential entities in negations', () => {
// Create a delta that references itself // Create a delta that references itself
const selfRefDelta = new Delta({ const selfRefDelta = new Delta({
@ -488,5 +464,165 @@ describe('Negation System', () => {
const view = lossless.view(['node1']); const view = lossless.view(['node1']);
expect(view.node1).toBeUndefined(); // Should be negated expect(view.node1).toBeUndefined(); // Should be negated
}); });
it('should handle multiple direct negations of the same delta', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
// Create the original delta
const originalDelta = new Delta({
creator: 'user1',
host: 'host1',
pointers: [
{ localContext: 'title', target: 'entity2', targetContext: 'title' },
{ localContext: 'status', target: 'Draft' }
]
});
// Create two negations of the same delta
const negation1 = NegationHelper.createNegation(originalDelta.id, 'user2', 'host1');
const negation2 = NegationHelper.createNegation(originalDelta.id, 'user3', 'host1');
// Process all deltas
testLossless.ingestDelta(originalDelta);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testLossless.view(['entity2']);
// The original delta should be negated (not in view) because it has two direct negations
expect(view.entity2).toBeUndefined();
// Verify the stats
const stats = testLossless.getNegationStats('entity2');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(1);
expect(stats.effectiveDeltas).toBe(0);
});
it('should handle complex negation chains', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
// Create the original delta
const deltaA = new Delta({
creator: 'user1',
host: 'host1',
pointers: [
{ localContext: 'content', target: 'entity3', targetContext: 'content' },
{ localContext: 'text', target: 'Hello World' }
]
});
// Create a chain of negations: B negates A, C negates B, D negates C
const deltaB = NegationHelper.createNegation(deltaA.id, 'user2', 'host1');
const deltaC = NegationHelper.createNegation(deltaB.id, 'user3', 'host1');
const deltaD = NegationHelper.createNegation(deltaC.id, 'user4', 'host1');
debug('Delta A (original): %s', deltaA.id);
debug('Delta B (negates A): %s', deltaB.id);
debug('Delta C (negates B): %s', deltaC.id);
debug('Delta D (negates C): %s', deltaD.id);
// Process all deltas in order
testLossless.ingestDelta(deltaA);
testLossless.ingestDelta(deltaB);
testLossless.ingestDelta(deltaC);
testLossless.ingestDelta(deltaD);
// Get the view after processing all deltas
const view = testLossless.view(['entity3']);
// The original delta should be negated because:
// - B negates A
// - C negates B (so A is no longer negated)
// - D negates C (so B is no longer negated, and A is negated again by B)
expect(view.entity3).toBeUndefined();
// Get all deltas for the entity
const allDeltas = [deltaA, deltaB, deltaC, deltaD];
// Get the stats
const stats = testLossless.getNegationStats('entity3');
const isANegated = NegationHelper.isDeltaNegated(deltaA.id, allDeltas);
const isBNegated = NegationHelper.isDeltaNegated(deltaB.id, allDeltas);
const isCNegated = NegationHelper.isDeltaNegated(deltaC.id, allDeltas);
const isDNegated = NegationHelper.isDeltaNegated(deltaD.id, allDeltas);
debug('Delta statuses:');
debug('- A (%s): %s', deltaA.id, isANegated ? 'NEGATED' : 'ACTIVE');
debug('- B (%s): %s, negates: %s', deltaB.id, isBNegated ? 'NEGATED' : 'ACTIVE', NegationHelper.getNegatedDeltaId(deltaB));
debug('- C (%s): %s, negates: %s', deltaC.id, isCNegated ? 'NEGATED' : 'ACTIVE', NegationHelper.getNegatedDeltaId(deltaC));
debug('- D (%s): %s, negates: %s', deltaD.id, isDNegated ? 'NEGATED' : 'ACTIVE', NegationHelper.getNegatedDeltaId(deltaD));
debug('Negation stats: %O', {
totalDeltas: stats.totalDeltas,
negationDeltas: stats.negationDeltas,
negatedDeltas: stats.negatedDeltas,
effectiveDeltas: stats.effectiveDeltas,
negationsByProperty: stats.negationsByProperty
});
// B, C, D are negation deltas
expect(stats.negationDeltas).toBe(3);
// A and C are effectively negated
expect(isANegated).toBe(true);
expect(isCNegated).toBe(true);
// B and D are not negated (they are negation deltas that are not themselves negated)
expect(isBNegated).toBe(false);
expect(isDNegated).toBe(false);
// No deltas remain unnegated
expect(stats.effectiveDeltas).toBe(0);
});
it('should handle multiple independent negations', () => {
const testNode = new RhizomeNode();
const testLossless = new Lossless(testNode);
// Create two independent deltas
const delta1 = new Delta({
creator: 'user1',
host: 'host1',
pointers: [
{ localContext: 'item', target: 'entity4', targetContext: 'item' },
{ localContext: 'name', target: 'Item 1' }
]
});
const delta2 = new Delta({
creator: 'user2',
host: 'host1',
pointers: [
{ localContext: 'item', target: 'entity4', targetContext: 'item' },
{ localContext: 'name', target: 'Item 2' }
]
});
// Create negations for both deltas
const negation1 = NegationHelper.createNegation(delta1.id, 'user3', 'host1');
const negation2 = NegationHelper.createNegation(delta2.id, 'user4', 'host1');
// Process all deltas
testLossless.ingestDelta(delta1);
testLossless.ingestDelta(delta2);
testLossless.ingestDelta(negation1);
testLossless.ingestDelta(negation2);
// Get the view after processing all deltas
const view = testLossless.view(['entity4']);
// Both deltas should be negated
expect(view.entity4).toBeUndefined();
// Verify the stats
const stats = testLossless.getNegationStats('entity4');
expect(stats.negationDeltas).toBe(2);
expect(stats.negatedDeltas).toBe(2);
expect(stats.effectiveDeltas).toBe(0);
});
}); });
}); });

View File

@ -251,7 +251,7 @@ describe('Nested Object Resolution Performance', () => {
while (currentView.nestedObjects.next && currentView.nestedObjects.next.length > 0) { while (currentView.nestedObjects.next && currentView.nestedObjects.next.length > 0) {
currentView = currentView.nestedObjects.next[0]; currentView = currentView.nestedObjects.next[0];
depth++; depth++;
if (depth >= 5) break; // Prevent infinite loop if (depth >= 10) break; // Prevent infinite loop
} }
expect(depth).toBeLessThanOrEqual(5); expect(depth).toBeLessThanOrEqual(5);

View File

@ -12,7 +12,8 @@
import { RhizomeNode } from '../src/node'; import { RhizomeNode } from '../src/node';
import { Delta } from '../src/core'; import { Delta } from '../src/core';
import { DefaultSchemaRegistry } from '../src/schema'; import { DefaultSchemaRegistry } from '../src/schema';
import { CommonSchemas, SchemaBuilder, PrimitiveSchemas, ReferenceSchemas } from '../src/schema'; import { SchemaBuilder, PrimitiveSchemas, ReferenceSchemas } from '../src/schema';
import { CommonSchemas } from '../src/test-utils/schemas';
import { TypedCollectionImpl } from '../src/collections'; import { TypedCollectionImpl } from '../src/collections';
describe('Nested Object Resolution', () => { describe('Nested Object Resolution', () => {

View File

@ -1,7 +1,8 @@
import { QueryEngine } from '../src/query'; import { QueryEngine } from '../src/query';
import { Lossless } from '../src/views'; import { Lossless } from '../src/views';
import { DefaultSchemaRegistry } from '../src/schema'; import { DefaultSchemaRegistry } from '../src/schema';
import { CommonSchemas, SchemaBuilder, PrimitiveSchemas } from '../src/schema'; import { SchemaBuilder, PrimitiveSchemas } from '../src/schema';
import { CommonSchemas } from '../src/test-utils/schemas';
import { Delta } from '../src/core'; import { Delta } from '../src/core';
import { RhizomeNode } from '../src/node'; import { RhizomeNode } from '../src/node';
@ -330,16 +331,28 @@ describe('Query Engine', () => {
expect(Object.keys(result.entities)).toHaveLength(0); expect(Object.keys(result.entities)).toHaveLength(0);
}); });
it('handles malformed JSON Logic expressions', async () => { it('rejects invalid JSON Logic operators', async () => {
await createUser('user1', 'Alice', 25); await createUser('user1', 'Alice', 25);
const result = await queryEngine.query('user', { // Should throw an error for invalid operator
await expect(
queryEngine.query('user', {
'invalid-operator': [{ 'var': 'age' }, 25] 'invalid-operator': [{ 'var': 'age' }, 25]
})
).rejects.toThrow('Invalid query operator: invalid-operator');
}); });
// Should not crash, may return empty results or skip problematic entities it('handles valid JSON Logic expressions with runtime errors', async () => {
await createUser('user1', 'Alice', 25);
// This is a valid operator but will cause a runtime error due to type mismatch
const result = await queryEngine.query('user', {
'>': [{ 'var': 'name' }, 25] // Can't compare string and number with >
});
// Should still return a result but log the error
expect(result).toBeDefined(); expect(result).toBeDefined();
expect(typeof result.totalFound).toBe('number'); expect(result.totalFound).toBe(0); // No matches due to the error
}); });
}); });
}); });

View File

@ -3,10 +3,11 @@ import {
PrimitiveSchemas, PrimitiveSchemas,
ReferenceSchemas, ReferenceSchemas,
ArraySchemas, ArraySchemas,
CommonSchemas, // CommonSchemas has been moved to ./test-utils/schemas
ObjectSchema ObjectSchema
} from '../src/schema'; } from '../src/schema';
import { DefaultSchemaRegistry } from '../src/schema'; import { DefaultSchemaRegistry } from '../src/schema';
import { CommonSchemas } from '../src/test-utils/schemas';
import { TypedCollectionImpl, SchemaValidationError } from '../src/collections'; import { TypedCollectionImpl, SchemaValidationError } from '../src/collections';
import { RhizomeNode } from '../src/node'; import { RhizomeNode } from '../src/node';
import { Delta } from '../src/core'; import { Delta } from '../src/core';

View File

@ -1,4 +1,3 @@
import * as _RhizomeImports from "../src";
import { Delta } from '../src/core'; import { Delta } from '../src/core';
import { Lossless } from '../src/views'; import { Lossless } from '../src/views';
import { RhizomeNode } from '../src/node'; import { RhizomeNode } from '../src/node';

View File

@ -1 +0,0 @@
MANIFEST-000030

Binary file not shown.

View File

@ -1 +0,0 @@
MANIFEST-000030

View File

Binary file not shown.

View File

@ -9,6 +9,7 @@
"lint": "eslint", "lint": "eslint",
"test": "node --experimental-vm-modules node_modules/.bin/jest", "test": "node --experimental-vm-modules node_modules/.bin/jest",
"coverage": "./scripts/coverage.sh", "coverage": "./scripts/coverage.sh",
"coverage-report": "npm run test -- --coverage --coverageDirectory=coverage",
"example-app": "node dist/examples/app.js" "example-app": "node dist/examples/app.js"
}, },
"jest": { "jest": {

View File

@ -81,38 +81,140 @@ export class NegationHelper {
/** /**
* Check if a delta is negated by any negation deltas * Check if a delta is negated by any negation deltas
* @param deltaId The ID of the delta to check
* @param deltas The list of all deltas to consider
* @returns True if the delta is effectively negated, false otherwise
*/ */
static isDeltaNegated(deltaId: DeltaID, deltas: Delta[]): boolean { static isDeltaNegated(deltaId: DeltaID, deltas: Delta[]): boolean {
return this.findNegationsFor(deltaId, deltas).length > 0; // Create a map of delta ID to its negation status
} const deltaStatus = new Map<DeltaID, boolean>();
// Create a map of delta ID to its negation deltas
const deltaToNegations = new Map<DeltaID, Delta[]>();
/** // First pass: collect all deltas and their negations
* Filter out negated deltas from a list
* Returns deltas that are not negated by any negation deltas in the list
*/
static filterNegatedDeltas(deltas: Delta[]): Delta[] {
const negatedDeltaIds = new Set<DeltaID>();
// First pass: collect all negated delta IDs
for (const delta of deltas) { for (const delta of deltas) {
if (this.isNegationDelta(delta)) { if (this.isNegationDelta(delta)) {
const negatedId = this.getNegatedDeltaId(delta); const negatedId = this.getNegatedDeltaId(delta);
if (negatedId) { if (negatedId) {
negatedDeltaIds.add(negatedId); if (!deltaToNegations.has(negatedId)) {
deltaToNegations.set(negatedId, []);
}
deltaToNegations.get(negatedId)!.push(delta);
} }
} }
} }
// Second pass: filter out negated deltas and negation deltas themselves // Function to determine if a delta is effectively negated
const isEffectivelyNegated = (currentDeltaId: DeltaID, visited: Set<DeltaID> = new Set()): boolean => {
// Avoid infinite recursion in case of cycles
if (visited.has(currentDeltaId)) {
return false; // If we've seen this delta before, assume it's not negated to break the cycle
}
// Check if we've already determined this delta's status
if (deltaStatus.has(currentDeltaId)) {
return deltaStatus.get(currentDeltaId)!;
}
// Get all negations targeting this delta
const negations = deltaToNegations.get(currentDeltaId) || [];
// If there are no negations, the delta is not negated
if (negations.length === 0) {
deltaStatus.set(currentDeltaId, false);
return false;
}
// Check each negation to see if it's effectively applied
// A negation is effective if it's not itself negated
for (const negation of negations) {
// If the negation delta is not itself negated, then the target is negated
if (!isEffectivelyNegated(negation.id, new Set([...visited, currentDeltaId]))) {
deltaStatus.set(currentDeltaId, true);
return true;
}
}
// If all negations are themselves negated, the delta is not negated
deltaStatus.set(currentDeltaId, false);
return false;
};
// Check if the target delta is negated
return isEffectivelyNegated(deltaId);
}
/**
* Filter out negated deltas from a list, handling both direct and indirect negations
* Returns deltas that are not effectively negated by any chain of negations
*/
static filterNegatedDeltas(deltas: Delta[]): Delta[] {
// Create a map of delta ID to its negation status
const deltaStatus = new Map<DeltaID, boolean>();
// Create a map of delta ID to its negation deltas
const deltaToNegations = new Map<DeltaID, NegationDelta[]>();
// First pass: collect all deltas and their negations
for (const delta of deltas) {
if (this.isNegationDelta(delta)) {
const negatedId = this.getNegatedDeltaId(delta);
if (negatedId) {
if (!deltaToNegations.has(negatedId)) {
deltaToNegations.set(negatedId, []);
}
deltaToNegations.get(negatedId)!.push(delta);
}
}
}
// Function to determine if a delta is effectively negated
const isEffectivelyNegated = (deltaId: DeltaID, visited: Set<DeltaID> = new Set()): boolean => {
// Avoid infinite recursion in case of cycles
if (visited.has(deltaId)) {
return false; // If we've seen this delta before, assume it's not negated to break the cycle
}
// Check if we've already determined this delta's status
if (deltaStatus.has(deltaId)) {
return deltaStatus.get(deltaId)!;
}
// Get all negations targeting this delta
const negations = deltaToNegations.get(deltaId) || [];
// If there are no negations, the delta is not negated
if (negations.length === 0) {
deltaStatus.set(deltaId, false);
return false;
}
// Check each negation to see if it's effectively applied
// A negation is effective if it's not itself negated
for (const negation of negations) {
// If the negation delta is not itself negated, then the target is negated
if (!isEffectivelyNegated(negation.id, new Set([...visited, deltaId]))) {
deltaStatus.set(deltaId, true);
return true;
}
}
// If all negations are themselves negated, the delta is not negated
deltaStatus.set(deltaId, false);
return false;
};
// Second pass: filter out effectively negated deltas and all negation deltas
return deltas.filter(delta => { return deltas.filter(delta => {
// Exclude negation deltas themselves (they're metadata) // Always exclude negation deltas (they're metadata)
if (this.isNegationDelta(delta)) { if (this.isNegationDelta(delta)) {
return false; return false;
} }
// Exclude deltas that have been negated // Check if this delta is effectively negated
if (negatedDeltaIds.has(delta.id)) { const isNegated = isEffectivelyNegated(delta.id);
debug(`Filtering out negated delta ${delta.id}`);
if (isNegated) {
debug(`Filtering out effectively negated delta ${delta.id}`);
return false; return false;
} }
@ -128,37 +230,158 @@ export class NegationHelper {
negationDeltas: number; negationDeltas: number;
negatedDeltas: number; negatedDeltas: number;
effectiveDeltas: number; effectiveDeltas: number;
negatedDeltaIds: DeltaID[]; negationsByProperty: { [key: string]: { negated: number; total: number } };
negationMap: Map<DeltaID, DeltaID[]>; // negated -> [negating deltas] negatedDeltaIds: string[];
negationMap: Map<DeltaID, DeltaID[]>;
} { } {
const negationDeltas = deltas.filter(d => this.isNegationDelta(d)); const negationDeltas = deltas.filter(d => this.isNegationDelta(d));
const negatedDeltaIds = new Set<DeltaID>();
const negationMap = new Map<DeltaID, DeltaID[]>(); const negationMap = new Map<DeltaID, DeltaID[]>();
const deltaById = new Map<DeltaID, Delta>();
const properties = new Set<string>();
const negatedDeltaIds = new Set<string>();
for (const negDelta of negationDeltas) { // Build maps and collect properties
const negatedId = this.getNegatedDeltaId(negDelta); for (const delta of deltas) {
deltaById.set(delta.id, delta);
// Collect all properties referenced in the delta
if (delta.pointers) {
for (const pointer of delta.pointers) {
if (pointer.targetContext) {
properties.add(pointer.targetContext);
}
}
}
if (this.isNegationDelta(delta)) {
const negatedId = this.getNegatedDeltaId(delta);
if (negatedId) { if (negatedId) {
negatedDeltaIds.add(negatedId);
if (!negationMap.has(negatedId)) { if (!negationMap.has(negatedId)) {
negationMap.set(negatedId, []); negationMap.set(negatedId, []);
} }
negationMap.get(negatedId)!.push(negDelta.id); negationMap.get(negatedId)!.push(delta.id);
}
} }
} }
const effectiveDeltas = deltas.length - negationDeltas.length - negatedDeltaIds.size; // Track which deltas are effectively negated
const deltaStatus = new Map<DeltaID, boolean>();
// Function to determine if a delta is effectively negated
const isEffectivelyNegated = (deltaId: DeltaID, visited: Set<DeltaID> = new Set()): boolean => {
// Avoid infinite recursion in case of cycles
if (visited.has(deltaId)) {
return false; // If we've seen this delta before, assume it's not negated to break the cycle
}
// Check if we've already determined this delta's status
if (deltaStatus.has(deltaId)) {
return deltaStatus.get(deltaId)!;
}
// Get all negations targeting this delta
const negations = negationMap.get(deltaId) || [];
// If there are no negations, the delta is not negated
if (negations.length === 0) {
deltaStatus.set(deltaId, false);
return false;
}
// Check each negation to see if it's effectively applied
// A negation is effective if it's not itself negated
for (const negationId of negations) {
// If the negation delta is not itself negated, then the target is negated
if (!isEffectivelyNegated(negationId, new Set([...visited, deltaId]))) {
deltaStatus.set(deltaId, true);
return true;
}
}
// If all negations are themselves negated, the delta is not negated
deltaStatus.set(deltaId, false);
return false;
};
// First pass: determine status of all deltas
for (const delta of deltas) {
isEffectivelyNegated(delta.id);
}
// Calculate statistics
let effectiveDeltas = 0;
const negationsByProperty: { [key: string]: { negated: number; total: number } } = {};
// Initialize property counters
for (const prop of properties) {
negationsByProperty[prop] = { negated: 0, total: 0 };
}
// Second pass: count negated and effective deltas
for (const delta of deltas) {
const isNegation = this.isNegationDelta(delta);
const isNegated = deltaStatus.get(delta.id) || false;
if (isNegated) {
// For non-negation deltas, add them to the negated set
if (!isNegation) {
negatedDeltaIds.add(delta.id);
} else {
// For negation deltas, add the delta they negate (if it's not a negation delta)
const negatedId = this.getNegatedDeltaId(delta);
if (negatedId) {
const negatedDelta = deltaById.get(negatedId);
if (negatedDelta && !this.isNegationDelta(negatedDelta)) {
negatedDeltaIds.add(negatedId);
}
}
}
}
if (!isNegation) {
if (isNegated) {
// Already counted in negatedDeltaIds
} else {
effectiveDeltas++;
}
}
}
// Update property-based statistics
for (const delta of deltas) {
const isNegated = deltaStatus.get(delta.id) || false;
if (delta.pointers) {
for (const pointer of delta.pointers) {
if (pointer.targetContext && negationsByProperty[pointer.targetContext] !== undefined) {
negationsByProperty[pointer.targetContext].total++;
if (isNegated) {
negationsByProperty[pointer.targetContext].negated++;
}
}
}
}
}
return { return {
totalDeltas: deltas.length, totalDeltas: deltas.length,
negationDeltas: negationDeltas.length, negationDeltas: negationDeltas.length,
negatedDeltas: negatedDeltaIds.size, negatedDeltas: negatedDeltaIds.size,
effectiveDeltas, effectiveDeltas,
negationsByProperty,
negatedDeltaIds: Array.from(negatedDeltaIds), negatedDeltaIds: Array.from(negatedDeltaIds),
negationMap negationMap
}; };
} }
/**
* Helper to check if a delta with the given ID is a negation delta
*/
private static isNegationDeltaById(deltaId: DeltaID, deltas: Delta[]): boolean {
const delta = deltas.find(d => d.id === deltaId);
return delta ? this.isNegationDelta(delta) : false;
}
/** /**
* Apply negations to a delta stream in chronological order * Apply negations to a delta stream in chronological order
* Later negations can override earlier ones * Later negations can override earlier ones

View File

@ -1,4 +1,4 @@
import { apply } from 'json-logic-js'; import { apply, is_logic } from 'json-logic-js';
import Debug from 'debug'; import Debug from 'debug';
import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema'; import { SchemaRegistry, SchemaID, ObjectSchema } from '../schema/schema';
import { Lossless, LosslessViewOne, LosslessViewMany, CollapsedDelta } from '../views/lossless'; import { Lossless, LosslessViewOne, LosslessViewMany, CollapsedDelta } from '../views/lossless';
@ -7,6 +7,21 @@ import { DeltaFilter } from '../core/delta';
const debug = Debug('rz:query'); const debug = Debug('rz:query');
// List of valid JSON Logic operators
const VALID_OPERATORS = new Set([
'==', '===', '!=', '!==', '>', '>=', '<', '<=', '!', '!!',
'and', 'or', 'if', '?:', '??', '!!', '!', '!!', '!!', '!',
'var', 'missing', 'missing_some', 'in', 'cat', 'log', 'method', 'merge',
'+', '-', '*', '/', '%', 'min', 'max', 'map', 'reduce', 'filter', 'all', 'some', 'none'
]);
class InvalidQueryOperatorError extends Error {
constructor(operator: string) {
super(`Invalid query operator: ${operator}`);
this.name = 'InvalidQueryOperatorError';
}
}
export type JsonLogic = Record<string, unknown>; export type JsonLogic = Record<string, unknown>;
export interface QueryOptions { export interface QueryOptions {
@ -29,6 +44,33 @@ export class QueryEngine {
/** /**
* Query entities by schema type with optional JSON Logic filter * Query entities by schema type with optional JSON Logic filter
*/ */
/**
* Validate JSON Logic operators in a filter
* @throws {InvalidQueryOperatorError} If an invalid operator is found
*/
private validateJsonLogicOperators(logic: unknown): void {
if (!logic || typeof logic !== 'object') {
return;
}
const logicObj = logic as Record<string, unknown>;
const operator = Object.keys(logicObj)[0];
// Check if this is an operator
if (is_logic(logic) && operator && !VALID_OPERATORS.has(operator)) {
throw new InvalidQueryOperatorError(operator);
}
// Recursively check nested logic
for (const value of Object.values(logicObj)) {
if (Array.isArray(value)) {
value.forEach(item => this.validateJsonLogicOperators(item));
} else if (value && typeof value === 'object') {
this.validateJsonLogicOperators(value);
}
}
}
async query( async query(
schemaId: SchemaID, schemaId: SchemaID,
filter?: JsonLogic, filter?: JsonLogic,
@ -36,6 +78,19 @@ export class QueryEngine {
): Promise<QueryResult> { ): Promise<QueryResult> {
debug(`Querying schema ${schemaId} with filter:`, filter); debug(`Querying schema ${schemaId} with filter:`, filter);
// Validate filter operators if provided
if (filter) {
try {
this.validateJsonLogicOperators(filter);
} catch (error) {
if (error instanceof InvalidQueryOperatorError) {
debug(`Invalid query operator: ${error.message}`);
throw error; // Re-throw to let the caller handle it
}
throw error;
}
}
// 1. Find all entities that could match this schema // 1. Find all entities that could match this schema
const candidateEntityIds = this.discoverEntitiesBySchema(schemaId); const candidateEntityIds = this.discoverEntitiesBySchema(schemaId);
debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`); debug(`Found ${candidateEntityIds.length} candidate entities for schema ${schemaId}`);
@ -124,12 +179,12 @@ export class QueryEngine {
const entity = this.lossless.domainEntities.get(entityId); const entity = this.lossless.domainEntities.get(entityId);
if (!entity) continue; if (!entity) continue;
// Check if entity has deltas for any required property // Check if entity has deltas for all required property
const hasRequiredProperty = requiredProperties.some(propertyId => const hasRequiredProperties = requiredProperties.every(propertyId =>
entity.properties.has(propertyId) entity.properties.has(propertyId)
); );
if (hasRequiredProperty) { if (hasRequiredProperties) {
candidateEntities.push(entityId); candidateEntities.push(entityId);
} }
} }
@ -154,12 +209,14 @@ export class QueryEngine {
} }
const filteredViews: LosslessViewMany = {}; const filteredViews: LosslessViewMany = {};
let hasFilterErrors = false;
const filterErrors: string[] = [];
for (const [entityId, view] of Object.entries(views)) { for (const [entityId, view] of Object.entries(views)) {
try {
// Convert lossless view to queryable object using schema // Convert lossless view to queryable object using schema
const queryableObject = this.losslessViewToQueryableObject(view, schema); const queryableObject = this.losslessViewToQueryableObject(view, schema);
try {
// Apply JSON Logic filter // Apply JSON Logic filter
const matches = apply(filter, queryableObject); const matches = apply(filter, queryableObject);
@ -170,11 +227,20 @@ export class QueryEngine {
debug(`Entity ${entityId} does not match filter`); debug(`Entity ${entityId} does not match filter`);
} }
} catch (error) { } catch (error) {
debug(`Error applying filter to entity ${entityId}:`, error); hasFilterErrors = true;
// Skip entities that cause filter errors const errorMsg = `Error applying filter to entity ${entityId}: ${error instanceof Error ? error.message : String(error)}`;
filterErrors.push(errorMsg);
debug(errorMsg, error);
// Continue processing other entities
} }
} }
// If we had any filter errors, log them as a warning
if (hasFilterErrors) {
console.warn(`Encountered ${filterErrors.length} filter errors. First error: ${filterErrors[0]}`);
debug('All filter errors:', filterErrors);
}
return filteredViews; return filteredViews;
} }

View File

@ -198,46 +198,8 @@ export class SchemaBuilder {
} }
} }
// Common schema patterns // Common schema patterns have been moved to __tests__/test-utils/schemas.ts
export const CommonSchemas = { // since they are only used for testing purposes.
// User schema with friends references
User: () => SchemaBuilder
.create('user')
.name('User')
.description('A user entity with profile information')
.property('name', PrimitiveSchemas.requiredString())
.property('email', PrimitiveSchemas.string())
.property('age', PrimitiveSchemas.number())
.property('active', PrimitiveSchemas.boolean())
.property('friends', ArraySchemas.of(ReferenceSchemas.to('user-summary', 2)))
.required('name')
.build(),
// User summary schema for references to prevent infinite recursion
UserSummary: () => SchemaBuilder
.create('user-summary')
.name('User Summary')
.description('Abbreviated user information for references')
.property('name', PrimitiveSchemas.requiredString())
.property('email', PrimitiveSchemas.string())
.required('name')
.additionalProperties(false)
.build(),
// Document schema
Document: () => SchemaBuilder
.create('document')
.name('Document')
.description('A document with metadata')
.property('title', PrimitiveSchemas.requiredString())
.property('content', PrimitiveSchemas.string())
.property('author', ReferenceSchemas.required('user-summary'))
.property('tags', ArraySchemas.of(PrimitiveSchemas.string()))
.property('created', PrimitiveSchemas.requiredNumber())
.property('published', PrimitiveSchemas.boolean())
.required('title', 'author', 'created')
.build()
} as const;
/** /**
* Context for tracking resolution state during nested object resolution * Context for tracking resolution state during nested object resolution

59
src/test-utils/schemas.ts Normal file
View File

@ -0,0 +1,59 @@
import { SchemaBuilder } from '../../src/schema';
/**
* Common schemas used for testing purposes.
* These schemas are not part of the main application code
* and are only used in test files.
*/
export const CommonSchemas = {
// User schema with friends references
User: () => SchemaBuilder
.create('user')
.name('User')
.description('A user entity with profile information')
.property('name', { type: 'primitive', primitiveType: 'string', required: true })
.property('email', { type: 'primitive', primitiveType: 'string' })
.property('age', { type: 'primitive', primitiveType: 'number' })
.property('active', { type: 'primitive', primitiveType: 'boolean' })
.property('friends', {
type: 'array',
itemSchema: {
type: 'reference',
targetSchema: 'user-summary',
maxDepth: 2
}
})
.required('name')
.build(),
// User summary schema for references to prevent infinite recursion
UserSummary: () => SchemaBuilder
.create('user-summary')
.name('User Summary')
.description('Abbreviated user information for references')
.property('name', { type: 'primitive', primitiveType: 'string', required: true })
.property('email', { type: 'primitive', primitiveType: 'string' })
.build(),
// Document schema
Document: () => SchemaBuilder
.create('document')
.name('Document')
.description('A document with title, content, and author')
.property('title', { type: 'primitive', primitiveType: 'string', required: true })
.property('content', { type: 'primitive', primitiveType: 'string' })
.property('author', {
type: 'reference',
targetSchema: 'user-summary',
maxDepth: 1,
required: true
})
.property('tags', {
type: 'array',
itemSchema: { type: 'primitive', primitiveType: 'string' }
})
.property('created', { type: 'primitive', primitiveType: 'number', required: true })
.property('published', { type: 'primitive', primitiveType: 'boolean' })
.required('title', 'author', 'created')
.build()
} as const;

View File

@ -41,11 +41,32 @@ export abstract class Lossy<Accumulator, Result> {
// apply a filter to the deltas composing that lossless view, // apply a filter to the deltas composing that lossless view,
// and then apply a supplied resolver function which receives // and then apply a supplied resolver function which receives
// the filtered lossless view as input. // the filtered lossless view as input.
// Resolve the current state of the view
resolve(entityIds?: DomainEntityID[]): Result | undefined { resolve(entityIds?: DomainEntityID[]): Result | undefined {
if (!entityIds) { if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys()); entityIds = Array.from(this.lossless.domainEntities.keys());
} }
// If we don't have an accumulator, build it from the lossless view
if (!this.accumulator) {
this.accumulator = {} as Accumulator;
// Use the general view method to get the full view
const fullView = this.lossless.view(entityIds, this.deltaFilter);
// Build the accumulator by reducing each entity's view
for (const entityId of entityIds) {
const losslessViewOne = fullView[entityId];
if (losslessViewOne) {
if (!this.accumulator) {
this.accumulator = this.initializer(losslessViewOne);
} else {
this.accumulator = this.reducer(this.accumulator, losslessViewOne);
}
}
}
}
if (!this.accumulator) return undefined; if (!this.accumulator) return undefined;
return this.resolver(this.accumulator); return this.resolver(this.accumulator);

View File

@ -64,8 +64,10 @@ export class AggregationResolver extends Lossy<Accumulator, Result> {
super(lossless); super(lossless);
} }
initializer(): Accumulator { initializer(view: LosslessViewOne): Accumulator {
return {}; return {
[view.id]: { id: view.id, properties: {} }
};
} }
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator { reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
@ -120,31 +122,7 @@ export class AggregationResolver extends Lossy<Accumulator, Result> {
return res; return res;
} }
// Override resolve to build accumulator on-demand if needed
resolve(entityIds?: DomainEntityID[]): Result | undefined {
if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys());
}
// If we don't have an accumulator, build it from the lossless view
if (!this.accumulator) {
this.accumulator = this.initializer();
// Use the general view method instead of viewSpecific
const fullView = this.lossless.view(entityIds, this.deltaFilter);
for (const entityId of entityIds) {
const losslessViewOne = fullView[entityId];
if (losslessViewOne) {
this.accumulator = this.reducer(this.accumulator, losslessViewOne);
}
}
}
if (!this.accumulator) return undefined;
return this.resolver(this.accumulator);
}
} }
// Convenience classes for common aggregation types // Convenience classes for common aggregation types

View File

@ -14,7 +14,8 @@ export interface ResolverPlugin<T = unknown> {
update(currentState: T, newValue: PropertyTypes, delta: CollapsedDelta): T; update(currentState: T, newValue: PropertyTypes, delta: CollapsedDelta): T;
// Resolve the final value from the accumulated state // Resolve the final value from the accumulated state
resolve(state: T): PropertyTypes; // Returns undefined if no valid value could be resolved
resolve(state: T): PropertyTypes | undefined;
} }
// Configuration for custom resolver // Configuration for custom resolver
@ -63,8 +64,10 @@ export class CustomResolver extends Lossy<CustomResolverAccumulator, CustomResol
super(lossless); super(lossless);
} }
initializer(): CustomResolverAccumulator { initializer(view: LosslessViewOne): CustomResolverAccumulator {
return {}; return {
[view.id]: { id: view.id, properties: {} }
};
} }
reducer(acc: CustomResolverAccumulator, cur: LosslessViewOne): CustomResolverAccumulator { reducer(acc: CustomResolverAccumulator, cur: LosslessViewOne): CustomResolverAccumulator {
@ -106,8 +109,11 @@ export class CustomResolver extends Lossy<CustomResolverAccumulator, CustomResol
for (const [propertyId, propertyState] of Object.entries(entity.properties)) { for (const [propertyId, propertyState] of Object.entries(entity.properties)) {
const resolvedValue = propertyState.plugin.resolve(propertyState.state); const resolvedValue = propertyState.plugin.resolve(propertyState.state);
// Only add the property if the resolved value is not undefined
if (resolvedValue !== undefined) {
entityResult.properties[propertyId] = resolvedValue; entityResult.properties[propertyId] = resolvedValue;
} }
}
// Only include entities that have at least one resolved property // Only include entities that have at least one resolved property
if (Object.keys(entityResult.properties).length > 0) { if (Object.keys(entityResult.properties).length > 0) {
@ -118,30 +124,7 @@ export class CustomResolver extends Lossy<CustomResolverAccumulator, CustomResol
return res; return res;
} }
// Override resolve to build accumulator on-demand if needed
resolve(entityIds?: DomainEntityID[]): CustomResolverResult | undefined {
if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys());
}
// If we don't have an accumulator, build it from the lossless view
if (!this.accumulator) {
this.accumulator = this.initializer();
const fullView = this.lossless.view(entityIds, this.deltaFilter);
for (const entityId of entityIds) {
const losslessViewOne = fullView[entityId];
if (losslessViewOne) {
this.accumulator = this.reducer(this.accumulator, losslessViewOne);
}
}
}
if (!this.accumulator) return undefined;
return this.resolver(this.accumulator);
}
} }
// Built-in plugin implementations // Built-in plugin implementations
@ -269,8 +252,8 @@ export class MinPlugin implements ResolverPlugin<{ min?: number }> {
return currentState; return currentState;
} }
resolve(state: { min?: number }): PropertyTypes { resolve(state: { min?: number }): PropertyTypes | undefined {
return state.min || 0; return state.min;
} }
} }
@ -290,7 +273,7 @@ export class MaxPlugin implements ResolverPlugin<{ max?: number }> {
return currentState; return currentState;
} }
resolve(state: { max?: number }): PropertyTypes { resolve(state: { max?: number }): PropertyTypes | undefined {
return state.max || 0; return state.max;
} }
} }

View File

@ -70,8 +70,10 @@ export function lastValueFromDeltas(
} }
export class LastWriteWins extends Lossy<Accumulator, Result> { export class LastWriteWins extends Lossy<Accumulator, Result> {
initializer(): Accumulator { initializer(view: LosslessViewOne): Accumulator {
return {}; return {
[view.id]: { id: view.id, properties: {} }
};
} }
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator { reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
@ -81,55 +83,31 @@ export class LastWriteWins extends Lossy<Accumulator, Result> {
for (const [key, deltas] of Object.entries(cur.propertyDeltas)) { for (const [key, deltas] of Object.entries(cur.propertyDeltas)) {
const { value, timeUpdated } = lastValueFromDeltas(key, deltas) || {}; const { value, timeUpdated } = lastValueFromDeltas(key, deltas) || {};
if (!value || !timeUpdated) continue; if (!value || timeUpdated === undefined) continue;
if (timeUpdated > (acc[cur.id].properties[key]?.timeUpdated || 0)) { const currentTime = acc[cur.id].properties[key]?.timeUpdated || 0;
acc[cur.id].properties[key] = { if (timeUpdated > currentTime) {
value, acc[cur.id].properties[key] = { value, timeUpdated };
timeUpdated
};
} }
} }
return acc; return acc;
}; }
resolver(cur: Accumulator): Result { resolver(cur: Accumulator): Result {
const res: Result = {}; const result: Result = {};
for (const [id, ent] of Object.entries(cur)) { for (const [id, entity] of Object.entries(cur)) {
res[id] = {id, properties: {}}; result[id] = {
for (const [key, {value}] of Object.entries(ent.properties)) { id,
res[id].properties[key] = value; properties: Object.fromEntries(
} Object.entries(entity.properties)
} .map(([key, { value }]) => [key, value])
)
return res;
}; };
// Override resolve to build accumulator on-demand if needed
resolve(entityIds?: DomainEntityID[]): Result | undefined {
if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys());
} }
// If we don't have an accumulator, build it from the lossless view return result;
if (!this.accumulator) {
this.accumulator = this.initializer();
// Use the general view method
const fullView = this.lossless.view(entityIds, this.deltaFilter);
for (const entityId of entityIds) {
const losslessViewOne = fullView[entityId];
if (losslessViewOne) {
this.accumulator = this.reducer(this.accumulator, losslessViewOne);
}
}
}
if (!this.accumulator) return undefined;
return this.resolver(this.accumulator);
} }
} }

View File

@ -72,8 +72,10 @@ export class TimestampResolver extends Lossy<Accumulator, Result> {
super(lossless); super(lossless);
} }
initializer(): Accumulator { initializer(view: LosslessViewOne): Accumulator {
return {}; return {
[view.id]: { id: view.id, properties: {} }
};
} }
reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator { reducer(acc: Accumulator, cur: LosslessViewOne): Accumulator {
@ -124,31 +126,7 @@ export class TimestampResolver extends Lossy<Accumulator, Result> {
return res; return res;
} }
// Override resolve to build accumulator on-demand if needed
resolve(entityIds?: DomainEntityID[]): Result | undefined {
if (!entityIds) {
entityIds = Array.from(this.lossless.domainEntities.keys());
}
// If we don't have an accumulator, build it from the lossless view
if (!this.accumulator) {
this.accumulator = this.initializer();
// Use the general view method instead of viewSpecific
const fullView = this.lossless.view(entityIds, this.deltaFilter);
for (const entityId of entityIds) {
const losslessViewOne = fullView[entityId];
if (losslessViewOne) {
this.accumulator = this.reducer(this.accumulator, losslessViewOne);
}
}
}
if (!this.accumulator) return undefined;
return this.resolver(this.accumulator);
}
} }
// Convenience classes for different tie-breaking strategies // Convenience classes for different tie-breaking strategies

View File

@ -1 +0,0 @@
MANIFEST-000020

View File

@ -1,3 +0,0 @@
2025/06/09-22:08:03.381417 7d18dafbe640 Recovering log #19
2025/06/09-22:08:03.388931 7d18dafbe640 Delete type=3 #18
2025/06/09-22:08:03.389041 7d18dafbe640 Delete type=0 #19

View File

@ -1,3 +0,0 @@
2025/06/09-22:06:22.826797 7d82d53fe640 Recovering log #17
2025/06/09-22:06:22.833921 7d82d53fe640 Delete type=0 #17
2025/06/09-22:06:22.833954 7d82d53fe640 Delete type=3 #16

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@ -1 +0,0 @@
MANIFEST-000241

View File

@ -1,5 +0,0 @@
2025/06/09-22:08:03.351430 7d18da7bd640 Recovering log #240
2025/06/09-22:08:03.351481 7d18da7bd640 Level-0 table #242: started
2025/06/09-22:08:03.353466 7d18da7bd640 Level-0 table #242: 877 bytes OK
2025/06/09-22:08:03.359635 7d18da7bd640 Delete type=0 #240
2025/06/09-22:08:03.359683 7d18da7bd640 Delete type=3 #238

View File

@ -1,5 +0,0 @@
2025/06/09-22:08:03.334848 7d18da7bd640 Recovering log #237
2025/06/09-22:08:03.334894 7d18da7bd640 Level-0 table #239: started
2025/06/09-22:08:03.337138 7d18da7bd640 Level-0 table #239: 855 bytes OK
2025/06/09-22:08:03.344340 7d18da7bd640 Delete type=0 #237
2025/06/09-22:08:03.344389 7d18da7bd640 Delete type=3 #235