
Commit

feat: use Symbol to hold the method batcher on decorator + simplified README.md usage (#4)

* feat: use Symbol to hold the method batcher on decorator + simplified README.md usage

* improved README.md

* NPM_TOKEN
onhate authored Aug 17, 2023
1 parent 3f7bea3 commit 901c009
Showing 7 changed files with 80 additions and 87 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/npm-publish.yml
Original file line number Diff line number Diff line change
@@ -31,4 +31,4 @@ jobs:
- run: npm run build
- run: npm publish
env:
NODE_AUTH_TOKEN: ${{secrets.npm_token}}
NODE_AUTH_TOKEN: ${{secrets.NPM_TOKEN}}
116 changes: 52 additions & 64 deletions README.md
@@ -9,18 +9,18 @@ backends and reduce requests to those backends via batching.
This library is especially useful for scenarios where you need to perform multiple asynchronous operations efficiently,
such as when making network requests or performing database queries.

Heavily inspired by [graphql/dataloader](https://github.com/graphql/dataloader) but using classes and decorators 😜
Heavily inspired by [graphql/dataloader](https://github.com/graphql/dataloader) but simpler, using decorators (😜 really
decoupled). Because of that, the
rest of your application doesn't even need to know about the batching/dataloader; it just works!

## Table of Contents

- [Installation](#installation)
- [Usage](#usage)
- [Basic Usage](#basic-usage)
- [Using the `@InBatches` Decorator](#using-the-inbatches-decorator)
- [Basic usage with `@InBatches` Decorator](#basic-usage-with-inbatches-decorator)
- [Advanced usage with custom `Batcher` class](#advanced-usage-with-custom-batcher-class)
- [API](#api)
- [`BatcherOptions`](#batcheroptions)
- [`Batcher<K, V>` Class](#batcherk-v-class)
- [`InBatches<K, V>` Decorator](#inbatches-decorator)
- [Contributing](#contributing)
- [License](#license)

@@ -30,29 +30,45 @@ Heavily inspired by [graphql/dataloader](https://github.com/graphql/dataloader)
npm install inbatches
```

or

```bash
yarn add inbatches
```

## Usage

### Using the `Batcher` Class
### Basic usage with `@InBatches` Decorator

The simplest way to get up and running is to use the `@InBatches` decorator. This decorator will wrap your method
and batch-enable it, like magic! 🧙‍♂️

```typescript
import { Batcher } from 'inbatches';
import { InBatches } from 'inbatches';

// Define a class that extends Batcher and implements the `run` method
// the `run` method will be called with an array of keys collected from the `enqueue` method
class MyBatcher extends Batcher<number, string> {
async run(ids: number[]): Promise<string[]> {
// Perform asynchronous operations using the keys
// you must return an array of results in the same order as the keys
return this.db.getMany(ids);
class MyService {

// (optional) overloaded method, where you define the keys as `number` and the return type as `User` for typings
async fetch(key: number): Promise<User>;

// This method is now batch-enabled
@InBatches()
async fetch(keys: number | number[]): Promise<User | User[]> {
if (Array.isArray(keys)) return await this.db.getMany(keys);

// in reality the Decorator will wrap this method and it will never be called with a single key :)
throw new Error('It will never be called with a single key 😉');
}
}
```

// Create an instance of your batcher
const batcher = new MyBatcher();
Profit! 🤑

```typescript
const service = new MyService();

// Enqueue keys for batched execution
const result = [1, 2, 3, 4, 5].map(async id => {
return await batcher.enqueue(id);
return await service.fetch(id);
});

// The result will be an array of results in the same order as the keys
@@ -61,34 +77,33 @@ result.then(results => {
});
```

### Using the `@InBatches` Decorator
### Advanced usage with custom `Batcher` class

The library also provides a decorator called `InBatches` that you can use to batch-enable methods of your class.
Another way to use the library is to create a class that extends the `Batcher` class and implements the `run` method.
This class provides an `enqueue` method that you can use to enqueue keys for batched execution.

```typescript
import { InBatches } from 'inbatches';

class MyService {

// (optional) overloaded method, where you define the keys as `number` and the return type as `string` for typings
async fetch(keys: number): Promise<string>;

// in reality the Decorator will wrap this method and it will never be called with a single key :)
@InBatches() // This method is now batch-enabled
async fetch(keys: number | number[]): Promise<string | string[]> {
if (Array.isArray(keys)) {
return this.db.getMany(keys);
}
import { Batcher } from 'inbatches';

// the Decorator will wrap this method and because of that it will never be called with a single key
throw new Error('It will never be called with a single key 😉');
// The `run` method will be called with an array of keys collected from the `enqueue` method
class MyBatcher extends Batcher<number, string> {
async run(ids: number[]): Promise<string[]> {
// Perform asynchronous operations using the keys
// you must return an array of results in the same order as the keys
return this.db.getMany(ids);
}
}
```

const service = new MyService();
then

```typescript
// Create an instance of your batcher
const batcher = new MyBatcher();

// Enqueue keys for batched execution
const result = [1, 2, 3, 4, 5].map(async id => {
return await service.fetch(id);
return await batcher.enqueue(id);
});

// The result will be an array of results in the same order as the keys
@@ -108,33 +123,6 @@ An interface to specify options for the batcher.
is `undefined` and will use `process.nextTick` to dispatch the batch, which is highly efficient and fast. Only use
this if you really want to accumulate promise calls in a window of time before dispatching the batch.
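The dispatch mechanism described above can be sketched in a few lines. This is a minimal, self-contained illustration of the batching technique, not the library's actual internals: `TinyBatcher`, its field names, and the `user:` result format are all made up for the example, and the default `process.nextTick` scheduling is approximated here with a resolved promise's microtask so the sketch stays runtime-agnostic.

```typescript
// Sketch: enqueue() collects keys synchronously; one microtask later,
// flush() runs once for the whole batch and resolves every caller.
class TinyBatcher {
  private keys: number[] = [];
  private resolvers: ((value: string) => void)[] = [];
  private scheduled = false;

  enqueue(key: number): Promise<string> {
    return new Promise<string>(resolve => {
      this.keys.push(key);
      this.resolvers.push(resolve);
      if (!this.scheduled) {
        this.scheduled = true;
        // stand-in for process.nextTick: defer the flush one microtask
        Promise.resolve().then(() => this.flush());
      }
    });
  }

  private flush(): void {
    const keys = this.keys;
    const resolvers = this.resolvers;
    this.keys = [];
    this.resolvers = [];
    this.scheduled = false;
    // stand-in for one bulk run(keys) call against a backend
    const results = keys.map(k => `user:${k}`);
    results.forEach((r, i) => resolvers[i](r));
  }
}

async function demo(): Promise<void> {
  const batcher = new TinyBatcher();
  // three synchronous enqueues, served by a single flush
  const results = await Promise.all([1, 2, 3].map(k => batcher.enqueue(k)));
  console.log(results); // ['user:1', 'user:2', 'user:3']
}

demo();
```

A `delayWindowInMs` option would simply swap the microtask deferral for a `setTimeout`, trading latency for larger batches.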

### `Batcher<K, V>` Class

An abstract class that provides the core functionality for batching and executing asynchronous operations.

- `enqueue(key: K): Promise<V>`: Enqueues a key for batching and returns a promise that resolves to the result when
available.

### `InBatches` Decorator

A decorator function that can be applied to methods to enable batching.

- Usage: `@InBatches(options?: BatcherOptions)`
- Example:

```typescript
class MyService {

// (optional) overloaded method, where you define the keys as `number` and the return type as `string` for typings
async fetchResults(keys: number): Promise<string>

@InBatches({ maxBatchSize: 10 })
async fetchResults(keys: number | number[]): Promise<string | string[]> {
// Batch-enabled method logic
}
}
```

## Contributing

Contributions are welcome! Feel free to open issues or submit pull requests on
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "inbatches",
"version": "0.0.7",
"version": "0.0.8",
"private": false,
"license": "MIT",
"repository": {
14 changes: 7 additions & 7 deletions src/batcher.spec.ts
@@ -1,6 +1,6 @@
import { Batcher, BatcherOptions } from './batcher';

class BatcherSpec extends Batcher<string, string> {
class RunInBatches extends Batcher<string, string> {
constructor(options?: BatcherOptions) {
super(options);
}
@@ -16,7 +16,7 @@ class BatcherSpec extends Batcher<string, string> {

describe('Batcher', () => {
it('should call run in batch', async () => {
const batcher = new BatcherSpec();
const batcher = new RunInBatches();

const promises = ['a', 'b', 'c'].map(key => {
return batcher.enqueue(key);
@@ -27,7 +27,7 @@ describe('Batcher', () => {
});

it('should call run in batch with max size', async () => {
const batcher = new BatcherSpec({ maxBatchSize: 2 });
const batcher = new RunInBatches({ maxBatchSize: 2 });

const promises = ['batch1.1', 'batch1.2', 'batch2.1', 'batch2.2'].map(key => {
return batcher.enqueue(key);
@@ -38,7 +38,7 @@ describe('Batcher', () => {
});

it('should call run method with unique keys if duplicates', async () => {
const batcher = new BatcherSpec();
const batcher = new RunInBatches();

const promises = ['a', 'b', 'a', 'c'].map(key => {
return batcher.enqueue(key);
@@ -49,7 +49,7 @@ describe('Batcher', () => {
});

it('should reject all with same error when run method failed', async () => {
const batcher = new BatcherSpec();
const batcher = new RunInBatches();

const promises = ['a', 'throw', 'c'].map(key => {
return batcher.enqueue(key);
@@ -63,7 +63,7 @@ describe('Batcher', () => {
});

it('should reject single with returned error when returning error', async () => {
const batcher = new BatcherSpec();
const batcher = new RunInBatches();

const promises = ['a', 'error', 'c'].map(key => {
return batcher.enqueue(key);
@@ -76,7 +76,7 @@ describe('Batcher', () => {
});

it('should call in batches with delay', cb => {
const batcher = new BatcherSpec({ delayWindowInMs: 100 });
const batcher = new RunInBatches({ delayWindowInMs: 100 });

const promises = ['a', 'b', 'c'].map(key => {
return batcher.enqueue(key);
8 changes: 4 additions & 4 deletions src/batcher.ts
@@ -11,19 +11,19 @@ interface Callback<K> {

class Batch<K, V> {
public active = true;
public readonly cache = new Map<K, Promise<V>>();
public readonly uniques = new Map<K, Promise<V>>();
public readonly callbacks: Callback<K>[] = [];

append(key: K) {
if (this.cache.has(key)) {
return this.cache.get(key);
if (this.uniques.has(key)) {
return this.uniques.get(key);
}

const promise = new Promise<V>((resolve, reject) => {
this.callbacks.push({ key, resolve, reject });
});

this.cache.set(key, promise);
this.uniques.set(key, promise);
return promise;
}
}
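The `cache` to `uniques` rename in the diff above is purely cosmetic, but the behavior it names is worth seeing in isolation: duplicate keys in the same batch share one promise and produce only one callback entry. Below is a trimmed, self-contained version of that `Batch` class (the `reject` handling and surrounding `Batcher` machinery are omitted for brevity).

```typescript
// Trimmed sketch of Batch<K, V>: append() dedupes repeated keys by
// returning the promise already stored in `uniques` for that key.
class Batch<K, V> {
  readonly uniques = new Map<K, Promise<V>>();
  readonly callbacks: { key: K; resolve: (value: V) => void }[] = [];

  append(key: K): Promise<V> {
    const existing = this.uniques.get(key);
    if (existing) return existing; // duplicate key: reuse the first promise

    const promise = new Promise<V>(resolve => {
      this.callbacks.push({ key, resolve });
    });
    this.uniques.set(key, promise);
    return promise;
  }
}

const batch = new Batch<string, string>();
const p1 = batch.append('a');
const p2 = batch.append('a'); // same key enqueued twice
console.log(p1 === p2); // true: both callers await the same promise
console.log(batch.callbacks.length); // 1: run() will see the key only once
```

This is what makes the spec above ("should call run method with unique keys if duplicates") pass: `run` receives each key once, yet every enqueuer still gets a result.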
1 change: 0 additions & 1 deletion src/decorator.spec.ts
@@ -4,7 +4,6 @@ class RunInBatches {
constructor(private id: string = '') {
}


async getAll(keys: string): Promise<string>;

@InBatches()
24 changes: 15 additions & 9 deletions src/decorator.ts
@@ -16,22 +16,28 @@ class MethodBatcher<I, K, V> extends Batcher<K, V> {
}
}

function getInstanceBatcher<K, V>(self: any, property: string, fn: Method<K, V>, options?: BatcherOptions) {
const bkey = `${property}_____batcher`;
// this Symbol is used to store the MethodBatcher instances in the instance of the class that is using the decorator
// this way we can have a unique batcher for each instance and method of the class decorated with @InBatches
const holder = Symbol('__inbatches__');

// check if the instance already has a batcher for this method
if (self[bkey]) return self[bkey];
function getInstanceBatcher<I, K, V>(instance: I, property: string, descriptor: Method<K, V>, options?: BatcherOptions) {
// check if the instance already has a holder for all the batchers in the class
instance[holder] = instance[holder] ?? new Map<string, MethodBatcher<I, K, V>>();

// otherwise, create a new batcher and store it in the instance so it is unique for that instance
self[bkey] = new MethodBatcher(self, fn, options);
return self[bkey];
// check if the instance already has a method matcher for this specific method
if (instance[holder].has(property)) return instance[holder].get(property);

// otherwise, create a new batcher and store it in the instance batchers holder
const batcher = new MethodBatcher<I, K, V>(instance, descriptor, options);
instance[holder].set(property, batcher);
return batcher;
}

export function InBatches<K, V>(options?: BatcherOptions) {
return function (_: any, property: string, descriptor: PropertyDescriptor) {
const fn = descriptor.value;
const method = descriptor.value;
descriptor.value = function (...args: any[]) {
const batcher = getInstanceBatcher<K, V>(this, property, fn, options);
const batcher = getInstanceBatcher<any, K, V>(this, property, method, options);
return batcher.enqueue(args);
};

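The Symbol-holder pattern this commit introduces can be demonstrated on its own. The sketch below is illustrative, not the library's exact code: `MethodBatcherSketch` is a stub standing in for the real `MethodBatcher`, and the batching logic itself is omitted. It shows the two properties the change relies on: the same instance-and-method pair always maps to one batcher, and the Symbol key never collides with, or leaks into, the object's ordinary string-keyed properties (unlike the old `${property}_____batcher` key).

```typescript
// A module-level Symbol keys the per-instance Map of batchers, so every
// decorated class shares one hidden slot without risk of name clashes.
const holder = Symbol('__inbatches__');

class MethodBatcherSketch {
  constructor(readonly owner: object, readonly method: string) {}
}

function getInstanceBatcher(instance: any, property: string): MethodBatcherSketch {
  // lazily create one Map per instance, keyed by decorated method name
  instance[holder] = instance[holder] ?? new Map<string, MethodBatcherSketch>();

  if (!instance[holder].has(property)) {
    instance[holder].set(property, new MethodBatcherSketch(instance, property));
  }
  return instance[holder].get(property);
}

const a = {};
const b = {};
console.log(getInstanceBatcher(a, 'fetch') === getInstanceBatcher(a, 'fetch')); // true: cached per instance+method
console.log(getInstanceBatcher(a, 'fetch') === getInstanceBatcher(b, 'fetch')); // false: each instance gets its own
console.log(Object.keys(a).length); // 0: the Symbol slot is invisible to normal key enumeration
```

Keeping the batcher per instance (rather than on the prototype) is what lets two instances of the same service batch independently, as the `decorator.spec.ts` constructor with a per-instance `id` suggests.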
