Hi. This is a great library and well documented. I have a question I hope you can help with; the answer would likely help others too.
The following minimal example demonstrates my issue. The code iterates through the database in batches of 100 keys and removes them. While it runs, memory usage climbs steadily from under 50 MB to over 490 MB. Once all keys (over 110,000) are removed, the growth stops, but the memory is never released: usage stays above 490 MB.
Is there anything you can see that I am doing wrong?
import { open } from 'lmdb';

const opts = { path: './testdb', compression: true };
const db = open(opts);

const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

while (true) {
  await wait(50);

  // Collect the next batch of up to 100 keys
  let keys = [];
  const iter = db.getKeys({ limit: 100, snapshot: true });
  for (const key of iter) {
    keys.push(key);
  }

  // Remove the batch
  keys.forEach((key) => {
    db.remove(key);
    console.log('Key:', key);
  });

  // Drop the reference so the array can be garbage-collected
  keys = undefined;
}

// Never reached: the loop above runs forever so that memory usage can
// still be observed after the database is empty
await db.close();
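For what it's worth, here is a small sketch for distinguishing V8 heap growth from overall process RSS while the loop runs (the mb helper and the one-second interval are illustrative additions, not part of the example above). Since LMDB memory-maps its data file, RSS also counts mapped pages that are not JavaScript allocations:

// Sketch: log V8 heap vs. process RSS once per second alongside the
// removal loop. For an LMDB-backed process, RSS includes pages of the
// memory-mapped data file, not just JavaScript heap memory.
const mb = (n) => (n / 1024 / 1024).toFixed(1);
setInterval(() => {
  const { rss, heapUsed, external } = process.memoryUsage();
  console.log(`rss=${mb(rss)} MB, heapUsed=${mb(heapUsed)} MB, external=${mb(external)} MB`);
}, 1000);

If heapUsed stays flat while rss keeps growing, the retained memory would be in the map or native allocations rather than in JavaScript objects.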