feat: add cache to dojo.js #359

Merged
2 commits merged into dojoengine:main on Dec 15, 2024

Conversation

edisontim (Contributor) commented Dec 14, 2024

Summary by CodeRabbit

  • New Features

    • Enhanced entity fetching with caching mechanisms.
    • Introduced functionality for inserting entities into an IndexedDB database.
  • Bug Fixes

    • Improved error handling and completion tracking during database transactions.
  • Documentation

    • Updated logging statements to reflect new functionalities.


coderabbitai bot commented Dec 14, 2024

Walkthrough

The pull request introduces enhancements to the entity fetching and synchronization process in the Dojo.js project. The changes focus on the packages/state/src/recs/index.ts file, adding new caching mechanisms and database interaction capabilities. The modifications include updating the getEntities function with new parameters, introducing setCache and getCache functions for timestamp management, and adding an insertEntitiesInDB function to handle IndexedDB interactions. The package.json file underwent minor formatting adjustments.
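
For orientation, a minimal consumer-side sketch follows. The options object ({ dbConnection, timestampCacheKey }) and the "entities" object store come from this PR's diff; the database name, version, upgrade handler, and import path are illustrative assumptions, not part of the change.

// Hypothetical wiring of the new cache options; DB name/version and import path are assumptions.
import { getEntities } from "@dojoengine/state"; // assumed public entry point for packages/state

function openEntityCacheDB(): Promise<IDBDatabase> {
    return new Promise((resolve, reject) => {
        const request = indexedDB.open("dojo-entity-cache", 1);
        request.onupgradeneeded = () => {
            // insertEntitiesInDB writes objects shaped { id, ...data } into an "entities" store.
            if (!request.result.objectStoreNames.contains("entities")) {
                request.result.createObjectStore("entities", { keyPath: "id" });
            }
        };
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
    });
}

// Usage (client, clause, and components come from the surrounding dojo setup):
// const dbConnection = await openEntityCacheDB();
// await getEntities(client, clause, components, [], [], 100, false, {
//     dbConnection,
//     timestampCacheKey: "entities-last-sync", // assumed cache key name
// });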

Changes

File: package.json
  • Reformatted with consistent indentation; no functional changes

File: packages/state/src/recs/index.ts
  • Added Entities import from @dojoengine/torii-client
  • Updated getEntities function signature
  • Introduced setCache and getCache functions
  • Added insertEntitiesInDB function for IndexedDB insertion
  • Modified getEvents and getEntitiesQuery functions

Possibly related PRs

  • feat: subscription clauses #235: The changes in this PR involve modifications to the getSyncEntities and getEntities functions in the same packages/state/src/recs/index.ts file.
  • fix: utils - change way to compute selector hash #250: This PR also modifies the getEntities function in packages/state/src/recs/index.ts, enhancing logging and error handling.
  • fix: entity sync #252: This PR updates the syncEntities function in packages/state/src/recs/index.ts, which is directly related to the changes in the main PR.
  • fix: set entites bug #263: This PR addresses a bug in the getEntitiesQuery function within packages/state/src/recs/index.ts.
  • fix: tests, improvements #302: This PR modifies the setup function in examples/example-vite-react-app-recs/src/dojo/setup.ts, which includes a call to getSyncEntities.
  • fix: paths #311: This PR focuses on fixing paths in the dojoConfig, which may indirectly relate to the overall structure and organization of the project.
  • fix: Dojo store add entity #349: This PR enhances the updateEntity function in the Dojo store, which may relate to the entity management changes in the main PR.
  • feat: add order_by and entity_models to sdk #358: This PR adds new parameters to the getEntities and getEventMessages functions, which could relate to the entity fetching mechanisms discussed in the main PR.

Suggested reviewers

  • MartianGreed

Poem

🐰 In the realm of Dojo's code so bright,
Entities dance with caching might,
IndexedDB opens its embrace,
Timestamps cached with rabbit's grace,
Synchronization takes its flight! 🚀


coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Outside diff range and nitpick comments (3)
packages/state/src/recs/index.ts (3)

226-226: Hardcoding 'internal_updated_at' to zero in 'getEvents'

Setting internal_updated_at: 0 fetches all events from the beginning. This may lead to inefficient data retrieval.

Consider parameterizing internal_updated_at to use a cached timestamp, similar to getEntities, to fetch only new events.

-                    internal_updated_at: 0,
+                    internal_updated_at: time, // Use cached timestamp or make configurable

302-302: Hardcoding 'internal_updated_at' to zero in 'getEntitiesQuery'

Similar to the previous comment, internal_updated_at is set to 0, potentially fetching all entities every time.

Consider using a cached timestamp to improve performance:

-            internal_updated_at: 0,
+            internal_updated_at: time, // Utilize a timestamp cache if appropriate

469-476: Adding 'setCache' and 'getCache' functions for timestamp management

The setCache and getCache functions interact with localStorage to store and retrieve timestamps. While suitable for browser environments, be cautious if the code runs in non-browser contexts where localStorage isn't available.

Consider abstracting the cache mechanism to handle different environments or potential localStorage failures. For instance, using a storage interface that can switch between localStorage and in-memory storage.
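
One possible shape for that abstraction, sketched here as an assumption rather than as part of the PR, keeps a single key/value interface and falls back to in-memory storage when localStorage is unavailable:

interface TimestampStore {
    get(key: string): number;
    set(key: string, time: number): void;
}

// Prefers localStorage; falls back to a plain Map outside the browser or when
// storage access throws (private mode, blocked storage, etc.).
const createTimestampStore = (): TimestampStore => {
    try {
        window.localStorage.setItem("__dojo_probe__", "1");
        window.localStorage.removeItem("__dojo_probe__");
        return {
            get: (key) => Number(window.localStorage.getItem(key) || 0),
            set: (key, time) =>
                window.localStorage.setItem(key, Math.floor(time).toString()),
        };
    } catch {
        const memory = new Map<string, number>();
        return {
            get: (key) => memory.get(key) ?? 0,
            set: (key, time) => {
                memory.set(key, Math.floor(time));
            },
        };
    }
};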

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9a357d4 and 735b278.

📒 Files selected for processing (2)
  • package.json (1 hunks)
  • packages/state/src/recs/index.ts (7 hunks)
✅ Files skipped from review due to trivial changes (1)
  • package.json
🔇 Additional comments (4)
packages/state/src/recs/index.ts (4)

11-11: Importing 'Entities' from '@dojoengine/torii-client'

The addition of Entities in the import statement is appropriate and aligns with its usage in the code below.


161-161: Retrieving cached timestamp using 'getCache'

The variable time is initialized using getCache(timestampCacheKey);. Ensure that timestampCacheKey is correctly provided and that getCache functions as intended.


187-188: Caching the current timestamp after fetching entities

Storing the current timestamp using setCache aids in fetching only updated entities in subsequent calls. Verify that time synchronization between the client and server does not cause missed updates due to clock drift.
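
If drift turns out to matter, one hedged mitigation (illustrative, not in the PR) is to re-fetch with a small overlap by subtracting a safety margin from the cached timestamp; re-fetched rows are simply upserted again by store.put:

// Assumed tolerance; tune for the deployment. Uses the getCache helper added in this PR.
const CLOCK_DRIFT_MARGIN_S = 60;
const time = Math.max(0, getCache(timestampCacheKey) - CLOCK_DRIFT_MARGIN_S);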


148-155: Updated 'getEntities' function signature with additional parameters

The getEntities function now accepts an additional parameter: an object containing dbConnection and timestampCacheKey with default values. This changes the function's signature, which may affect existing calls to getEntities.

Please verify all usages of getEntities in the codebase to ensure they accommodate the new parameters or that the default values suffice.

✅ Verification successful

Let me analyze the getEntities function signature more precisely by checking its implementation.


Let me try a different approach to find the implementation.


Based on the search results from the first rg command, I can now generate the final response:

The new parameters are safe to add with default values

From the codebase analysis, all existing calls to getEntities are using the old parameter signature, but since the new parameters (dbConnection and timestampCacheKey) are optional with default values (undefined and empty string respectively), this change is backward compatible and won't break existing code.

Examples of current usage patterns found:

  • Direct calls with query objects: db.getEntities({ ... })
  • Calls with predicates: getEntities((entity) => ...)
  • Calls with client and query: getEntities(client, query, ...)
🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Find all calls to 'getEntities' and check for the new parameters.

rg 'getEntities\(' -A 3

Length of output: 6403


Script:

#!/bin/bash
# Find the implementation of getEntities in packages/state/src/recs/index.ts
ast-grep --pattern 'export async function getEntities($_' -A 10

Length of output: 65


Script:

#!/bin/bash
# Find the implementation of getEntities in the specific file
cat packages/state/src/recs/index.ts | grep -A 20 "export async function getEntities"

Length of output: 87

Comment on lines +173 to +175

    if (dbConnection) {
        await insertEntitiesInDB(dbConnection, entities);
    }

⚠️ Potential issue

Inserting entities into IndexedDB when 'dbConnection' is provided

The conditional insertion of entities into IndexedDB using insertEntitiesInDB if dbConnection is available adds offline capability or caching. Ensure that:

  • The insertEntitiesInDB function handles entities correctly.
  • Error handling within insertEntitiesInDB is robust.
  • The transaction logic accounts for potential failures.

Please review and test the insertEntitiesInDB function thoroughly to prevent data inconsistency.
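
As a starting point, a smoke test along these lines could exercise the round trip. It assumes insertEntitiesInDB is exported for testing, a jest/vitest-style runner, and an IndexedDB implementation in the test environment (for example the fake-indexeddb package), with the "entities" store keyed by "id" as the put calls imply:

import "fake-indexeddb/auto"; // assumed test-only dependency providing a global indexedDB
import { insertEntitiesInDB } from "./index"; // assumes the function is exported for tests

function openTestDB(): Promise<IDBDatabase> {
    return new Promise((resolve, reject) => {
        const req = indexedDB.open("test-entities", 1);
        req.onupgradeneeded = () =>
            req.result.createObjectStore("entities", { keyPath: "id" });
        req.onsuccess = () => resolve(req.result);
        req.onerror = () => reject(req.error);
    });
}

it("persists fetched entities into IndexedDB", async () => {
    const db = await openTestDB();
    // Minimal stand-in for the torii Entities shape (entityId -> model data).
    await insertEntitiesInDB(db, { "0x1": { Position: { x: 1, y: 2 } } } as any);

    const count = await new Promise<number>((resolve, reject) => {
        const req = db.transaction("entities").objectStore("entities").count();
        req.onsuccess = () => resolve(req.result);
        req.onerror = () => reject(req.error);
    });
    expect(count).toBe(1);
});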

Comment on lines +478 to +517
async function insertEntitiesInDB(
    db: IDBDatabase,
    entities: Entities
): Promise<void> {
    return new Promise((resolve, reject) => {
        const transaction = db.transaction(["entities"], "readwrite");
        const store = transaction.objectStore("entities");

        let completed = 0;
        let error: Error | null = null;

        // Handle transaction completion
        transaction.oncomplete = () => {
            if (error) {
                reject(error);
            } else {
                resolve();
            }
        };

        transaction.onerror = () => {
            reject(transaction.error);
        };

        // Store each entity
        for (const [entityId, data] of Object.entries(entities)) {
            const entityData = {
                id: entityId,
                ...data,
            };

            const request = store.put(entityData);
            completed++;

            request.onerror = () => {
                error = request.error;
            };
        }
    });
}

🛠️ Refactor suggestion

⚠️ Potential issue

Implementing 'insertEntitiesInDB' function for IndexedDB storage

The insertEntitiesInDB function adds entities to IndexedDB, but there are concerns about error handling and transaction management:

  • Error Propagation: Errors in individual store.put requests are assigned to the error variable but do not affect the transaction unless explicitly handled.
  • Transaction Completion: If a request.onerror occurs, the transaction may still complete successfully unless aborted.

Refactor the function to properly handle errors and ensure the transaction is aborted when a request fails:

async function insertEntitiesInDB(
    db: IDBDatabase,
    entities: Entities
): Promise<void> {
    return new Promise((resolve, reject) => {
        const transaction = db.transaction(["entities"], "readwrite");
        const store = transaction.objectStore("entities");

        transaction.oncomplete = () => resolve();
        transaction.onerror = () => reject(transaction.error);

        for (const [entityId, data] of Object.entries(entities)) {
            const entityData = {
                id: entityId,
                ...data,
            };

            const request = store.put(entityData);

            request.onerror = () => {
                transaction.abort(); // Abort the transaction on error
                reject(request.error);
            };
        }
    });
}

This ensures that any error during the put operations will abort the transaction and reject the promise, preventing partial data writes.


@@ -158,9 +167,12 @@
                    order_by: orderBy,
                    entity_models: entityModels,
                    dont_include_hashed_keys: false,
                    internal_updated_at: time,

💡 Codebase verification

Based on the search results, I can now generate a final response:

The internal_updated_at parameter is not supported in the ToriiClient API

The codebase search reveals that the getEntities method in the ToriiClient API accepts a limited set of parameters: limit, offset, clause, order_by, and entity_models. There is no evidence of internal_updated_at being a supported parameter in any of the implementations or type definitions.

The current implementation might lead to runtime errors or ignored parameters. Consider:

  • Removing the internal_updated_at parameter from the query
  • Implementing the timestamp-based filtering logic at the application level after fetching the entities
  • Or consulting with the backend team to add this functionality if it's crucial for performance
🔗 Analysis chain

Including 'internal_updated_at' parameter in 'client.getEntities'

Adding internal_updated_at: time to the client.getEntities call will fetch entities updated since the cached timestamp. Confirm that the backend supports this parameter and that it effectively reduces redundant data retrieval.

Run the following script to check if internal_updated_at is a valid parameter in the ToriiClient API:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify 'internal_updated_at' parameter usage in ToriiClient.

rg 'getEntities\(' ./node_modules/@dojoengine/torii-client -A 10

Length of output: 146


Script:

#!/bin/bash
# Let's search for getEntities in the entire codebase
rg "getEntities" -A 5

# Also search for any interface or type definitions related to this
ast-grep --pattern 'interface $_ {
  $$$
  getEntities$$$
  $$$
}'

# And check for any type definitions
ast-grep --pattern 'type $_ = {
  $$$
  getEntities$$$
  $$$
}'

Length of output: 17034


coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (1)
packages/state/src/recs/index.ts (1)

147-154: Consider separating caching configuration

The function is taking on too many responsibilities. Consider extracting the caching configuration into a separate options object or dedicated cache manager class.

+ interface CacheConfig {
+   dbConnection?: IDBDatabase;
+   timestampCacheKey: string;
+ }

export const getEntities = async <S extends Schema>(
    client: ToriiClient,
    clause: Clause | undefined,
    components: Component<S, Metadata, undefined>[],
    orderBy: OrderBy[] = [],
    entityModels: string[] = [],
    limit: number = 100,
    logging: boolean = false,
-   {
-       dbConnection,
-       timestampCacheKey,
-   }: { dbConnection: IDBDatabase | undefined; timestampCacheKey: string } = {
-       dbConnection: undefined,
-       timestampCacheKey: "",
-   }
+   cacheConfig?: CacheConfig
)
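
For comparison, call sites under the suggested shape might look like this (illustrative; cacheConfig is the bot's proposal, not the merged API, and client, clause, components, and dbConnection come from the surrounding dojo setup):

// Inside an async setup function:
// Caching disabled: the trailing options argument is simply omitted.
await getEntities(client, clause, components, [], [], 100, false);

// Caching enabled: pass the IndexedDB handle and a timestamp cache key.
await getEntities(client, clause, components, [], [], 100, false, {
    dbConnection,
    timestampCacheKey: "entities-last-sync", // assumed cache key name
});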
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 735b278 and 526a445.

📒 Files selected for processing (1)
  • packages/state/src/recs/index.ts (7 hunks)
🔇 Additional comments (3)
packages/state/src/recs/index.ts (3)

11-11: LGTM: Import addition is appropriate

The addition of the Entities type import is necessary for the new IndexedDB functionality.


170-170: Verify internal_updated_at parameter support

Based on previous review comments, the internal_updated_at parameter might not be supported in the ToriiClient API. This could lead to runtime issues.

#!/bin/bash
# Search for internal_updated_at usage in ToriiClient
rg -A 5 "internal_updated_at" ./node_modules/@dojoengine/torii-client

481-520: 🛠️ Refactor suggestion

Improve IndexedDB implementation

The current implementation has several areas for improvement:

  1. Transaction error handling could be more robust
  2. Performance could be improved with batch processing
  3. Completed counter is unused
 async function insertEntitiesInDB(
     db: IDBDatabase,
     entities: Entities
 ): Promise<void> {
     return new Promise((resolve, reject) => {
         const transaction = db.transaction(["entities"], "readwrite");
         const store = transaction.objectStore("entities");

-        let completed = 0;
         let error: Error | null = null;

         // Handle transaction completion
         transaction.oncomplete = () => {
-            if (error) {
-                reject(error);
-            } else {
-                resolve();
-            }
+            resolve();
         };

         transaction.onerror = () => {
+            console.error('Transaction failed:', transaction.error);
             reject(transaction.error);
         };

+        const BATCH_SIZE = 100;
+        const entries = Object.entries(entities);
+        
+        // Process in batches for better performance
+        for (let i = 0; i < entries.length; i += BATCH_SIZE) {
+            const batch = entries.slice(i, i + BATCH_SIZE);
+            
+            for (const [entityId, data] of batch) {
                 const entityData = {
                     id: entityId,
                     ...data,
                 };

                 const request = store.put(entityData);
-                completed++;

                 request.onerror = () => {
-                    error = request.error;
+                    console.error('Failed to store entity:', entityId, request.error);
+                    transaction.abort();
+                    reject(request.error);
                 };
             }
+        }
     });
 }

Likely invalid or redundant comment.

@@ -209,6 +226,7 @@ export const getEvents = async <S extends Schema>(
                    order_by: orderBy,
                    entity_models: entityModels,
                    dont_include_hashed_keys: false,
                    internal_updated_at: 0,

⚠️ Potential issue

Review timestamp handling consistency

Both getEvents and getEntitiesQuery set internal_updated_at: 0, while getEntities uses a cached timestamp. This inconsistency might lead to unexpected behavior.

Consider either:

  1. Implementing consistent timestamp caching across all methods
  2. Documenting why these methods behave differently

Also applies to: 305-305
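
If option 1 is chosen, a small helper along these lines (hypothetical; timestampCacheKey would become a new optional parameter of getEvents and getEntitiesQuery) would keep the behavior aligned with getEntities:

// Sketch only: resolve the incremental-fetch timestamp from an optional cache key,
// reusing the getCache helper this PR adds to packages/state/src/recs/index.ts.
const resolveUpdatedSince = (timestampCacheKey?: string): number =>
    timestampCacheKey ? getCache(timestampCacheKey) : 0;

// getEvents / getEntitiesQuery would then pass
//     internal_updated_at: resolveUpdatedSince(timestampCacheKey),
// instead of the hardcoded 0.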

Comment on lines +188 to +191

    if (dbConnection) {
        const currentTime = Math.floor(Date.now() / 1000);
        setCache(currentTime, timestampCacheKey);
    }

⚠️ Potential issue

Add error handling for localStorage operations

The timestamp caching operations should include error handling for cases where localStorage is unavailable or fails.

    if (dbConnection) {
        const currentTime = Math.floor(Date.now() / 1000);
-       setCache(currentTime, timestampCacheKey);
+       try {
+           setCache(currentTime, timestampCacheKey);
+       } catch (error) {
+           console.warn('Failed to set cache timestamp:', error);
+           // Continue execution as caching is non-critical
+       }
    }

Comment on lines +472 to +479

const setCache = (time: number, timestampCacheKey: string) => {
    const timeString = Math.floor(time).toString();
    localStorage.setItem(timestampCacheKey, timeString);
};

const getCache = (timestampCacheKey: string) => {
    return Number(localStorage.getItem(timestampCacheKey) || 0);
};

🛠️ Refactor suggestion

Enhance cache utility functions with type safety and error handling

The cache utility functions need improvements for robustness and maintainability.

+ const CACHE_KEY_PREFIX = 'dojo_cache_';
+ 
+ interface CacheError extends Error {
+   type: 'storage_unavailable' | 'invalid_data';
+ }

- const setCache = (time: number, timestampCacheKey: string) => {
+ const setCache = (time: number, timestampCacheKey: string): void => {
+   try {
+     if (!timestampCacheKey) {
+       throw new Error('Invalid cache key');
+     }
    const timeString = Math.floor(time).toString();
-   localStorage.setItem(timestampCacheKey, timeString);
+   localStorage.setItem(`${CACHE_KEY_PREFIX}${timestampCacheKey}`, timeString);
+   } catch (error) {
+     const cacheError: CacheError = new Error(
+       `Failed to set cache: ${error.message}`
+     ) as CacheError;
+     cacheError.type = 'storage_unavailable';
+     throw cacheError;
+   }
};

- const getCache = (timestampCacheKey: string) => {
+ const getCache = (timestampCacheKey: string): number => {
+   try {
+     if (!timestampCacheKey) {
+       throw new Error('Invalid cache key');
+     }
-   return Number(localStorage.getItem(timestampCacheKey) || 0);
+   const value = localStorage.getItem(
+     `${CACHE_KEY_PREFIX}${timestampCacheKey}`
+   );
+   return value ? Number(value) : 0;
+   } catch (error) {
+     console.warn(`Failed to get cache: ${error.message}`);
+     return 0;
+   }
};

ponderingdemocritus merged commit eb378d6 into dojoengine:main on Dec 15, 2024
2 of 3 checks passed