Remaining documents #395

Open · wants to merge 2 commits into base: dubbo3
**docs/guide/dubboForWEB/Choosingaprotocol.md** (70 additions, 0 deletions)
# Choosing a protocol

In addition to the Dubbo protocol, Dubbo ships with support for the gRPC-web protocol. If your backend does not support the Dubbo protocol, you can still use Dubbo clients to interface with it.
> **Reviewer (Contributor):** Perhaps the Triple protocol is more relevant to the context.

## Connect
> **Reviewer (Contributor):** Triple

The function `createDubboTransport()` creates a transport for the Dubbo protocol. It uses the [fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) for the actual network operations, which is widely supported in web browsers. The most important options for the Dubbo transport are as follows:

```ts
import { createDubboTransport } from "@apachedubbo/dubbo-web";

const transport = createDubboTransport({
  // Requests will be made to <baseUrl>/<package>.<service>/method
  baseUrl: "http://localhost:8080",

  // By default, this transport uses the JSON format.
  // Set this option to true to use the binary format.
  useBinaryFormat: false,

  // Controls what the fetch client will do with credentials, such as
  // Cookies. The default value is "same-origin", which will not
  // transmit Cookies in cross-origin requests.
  credentials: "same-origin",

  // Interceptors apply to all calls running through this transport.
  interceptors: [],

  // By default, all requests use POST. Set this option to true to use GET
  // for side-effect free RPCs.
  useHttpGet: false,

  // Optional override of the fetch implementation used by the transport.
  fetch: globalThis.fetch,
});
```

We generally recommend the JSON format for web browsers, because it makes it trivial to follow what exactly is sent over the wire with the browser's network inspector.

Dubbo supports optionally using HTTP GET requests for side-effect free RPC calls, to enable easy use of request caching and more. For more information on HTTP GET requests, see **Get Requests**.

When creating your transport, you have the option of providing your own custom Fetch function. This can be useful in many different scenarios, such as overriding or setting Fetch properties, as well as working with frameworks such as Svelte that come bundled with their own Fetch implementation.
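
For illustration, here is a minimal sketch of one such scenario: wrapping `globalThis.fetch` to attach an extra request header. The header name and value are placeholders, and only transport options already shown above are used.

```ts
import { createDubboTransport } from "@apachedubbo/dubbo-web";

// A thin wrapper around the global fetch that attaches an extra header.
// "x-client-version" is only a placeholder for illustration.
const customFetch: typeof globalThis.fetch = (input, init) => {
  const headers = new Headers(init?.headers);
  headers.set("x-client-version", "1.0.0");
  return globalThis.fetch(input, { ...init, headers });
};

const transport = createDubboTransport({
  baseUrl: "http://localhost:8080",
  fetch: customFetch,
});
```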

It is also possible to configure individual Fetch properties when issuing requests. For some examples of customizing Fetch within Dubbo, take a look at our **Interceptors** docs. In addition, to see how to work with Fetch in an SSR context with frameworks such as Svelte and Next.js, visit our documentation on **SSR**.

## gRPC-web

The function `createGrpcWebTransport()` creates a Transport for the gRPC-web protocol. Any gRPC service can be made available to gRPC-web with the [Envoy Proxy](https://www.envoyproxy.io/). ASP.NET Core supports gRPC-web with a [middleware](https://docs.microsoft.com/en-us/aspnet/core/grpc/browser?view=aspnetcore-6.0). Dubbo for Node and `dubbo-go` support gRPC-web out of the box.

```ts
import { createGrpcWebTransport } from "@apachedubbo/dubbo-web";

const transport = createGrpcWebTransport({
  // Requests will be made to <baseUrl>/<package>.<service>/method
  baseUrl: "http://localhost:8080",

  // By default, this transport uses the binary format, because
  // not all gRPC-web implementations support JSON.
  useBinaryFormat: true,

  // Controls what the fetch client will do with credentials, such as
  // Cookies. The default value is "same-origin", which will not
  // transmit Cookies in cross-origin requests.
  credentials: "include",

  // Interceptors apply to all calls running through this transport.
  interceptors: [],

  // Optional override of the fetch implementation used by the transport.
  fetch: globalThis.fetch,
});
```
**docs/guide/dubboForWEB/GeneratingCode.md** (124 additions, 1 deletion)
# Generating code

We mentioned earlier that the ELIZA service defines a Protocol Buffer schema. So what *is* that schema? It is really just a simple file that describes the service, its methods, and their argument and return types:
> **Reviewer (Contributor):** The emphasis on *is* is unnecessary formatting.

```protobuf
syntax = "proto3";

service ElizaService {
  rpc Say(SayRequest) returns (SayResponse) {}
}

message SayRequest {
  string sentence = 1;
}

message SayResponse {
  string sentence = 1;
}
```

You can see the full version including comments and some additional RPCs [on the Buf Schema Registry](https://buf.build/connectrpc/eliza/file/main:connectrpc/eliza/v1/eliza.proto) (BSR). The `rpc` keyword stands for Remote Procedure Call — a method you can invoke remotely. The schema is the contract between server and client, and it precisely defines how data is exchanged down to the very details of serialization.
> **Reviewer (Contributor):** Modify the link.

The schema comes to life by generating code. For the server, an interface is generated, and the developer can focus on filling the methods with business logic. For the client, there really isn't anything to do — the developer can just call the client methods, rely on the generated types for compile-time type-safety and serialization, and focus on the application logic.
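
To make the client side concrete, here is a minimal sketch of a web client calling the ELIZA service. It assumes `createPromiseClient` is exported from `@apachedubbo/dubbo`, and the import paths are placeholders for the generated files described below.

```ts
import { createPromiseClient } from "@apachedubbo/dubbo";
import { createDubboTransport } from "@apachedubbo/dubbo-web";
import { ElizaService } from "./gen/eliza_dubbo.js";

// The transport decides how requests are sent; see "Choosing a protocol".
const transport = createDubboTransport({ baseUrl: "http://localhost:8080" });

// The generated service definition gives the client compile-time types:
// say() takes a SayRequest and resolves to a SayResponse.
const client = createPromiseClient(ElizaService, transport);
const res = await client.say({ sentence: "I feel happy." });
console.log(res.sentence);
```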

## Generated SDKs

In the tutorial, we have been using [generated SDKs](https://buf.build/docs/bsr/generated-sdks/npm) with an `npm install` command. When the package was requested, the BSR npm registry ran the schema through a code generator and served the generated files as a package with all required dependencies.

If you want to use a Dubbo or gRPC service whose schema is published on the BSR, you can simply use `npm` to install the package, and hit the service with a Dubbo client.

See our [documentation on generated SDKs](https://buf.build/docs/bsr/generated-sdks/overview) for details.

## Local generation

We're going to generate our code using [Buf](https://buf.build/product/cli/), a modern replacement for Google's protobuf compiler, and two compiler plugins:

- [@apachedubbo/protoc-gen-apache-dubbo-es](https://www.npmjs.com/package/@apachedubbo/protoc-gen-apache-dubbo-es) — generates services from your Protocol Buffer schema
- [@bufbuild/protoc-gen-es](https://www.npmjs.com/package/@bufbuild/protoc-gen-es) — generates base types, like request and response messages

The code we will generate has three runtime dependencies:

- [@apachedubbo/dubbo](https://www.npmjs.com/package/@apachedubbo/dubbo) — provides clients, interceptors, errors, and other primitives for Dubbo
- [@apachedubbo/dubbo-web](https://www.npmjs.com/package/@apachedubbo/dubbo-web) — provides the Dubbo and gRPC-web protocols for web browsers
- [@bufbuild/protobuf](https://www.npmjs.com/package/@bufbuild/protobuf) — provides serialization and more for the base types

First, let's install `buf`, the plugins and runtime dependencies:

```bash
$ npm install --save-dev @bufbuild/buf @apachedubbo/protoc-gen-apache-dubbo-es @bufbuild/protoc-gen-es
$ npm install @apachedubbo/dubbo @apachedubbo/dubbo-web @bufbuild/protobuf
```

Next, tell Buf to use the two plugins with a new configuration file:

```yaml
# buf.gen.yaml defines a local generation template.
# For details, see https://buf.build/docs/configuration/v1/buf-gen-yaml
version: v1
plugins:
  # This will invoke protoc-gen-es and write output to src/gen
  - plugin: es
    out: src/gen
    opt:
      # Add more plugin options here
      - target=ts
  # This will invoke protoc-gen-apache-dubbo-es
  - plugin: apache-dubbo-es
    out: src/gen
    opt:
      # Add more plugin options here
      - target=ts
```

If desired, you can also skip local plugin installation and use [remote plugins](https://buf.build/docs/bsr/remote-plugins/overview).

Finally, tell Buf to generate code for the ELIZA schema:

```bash
$ npx buf generate buf.build/apache-dubbo/eliza
```

If you prefer, you can use `protoc` instead of Buf — the plugins behave like any other plugin.

### Output

Let's take a peek at what was generated. There are two new files:

- `src/gen/apache-dubbo/eliza/v1/eliza_dubbo.ts`
- `src/gen/apache-dubbo/eliza/v1/eliza_pb.ts`

The first file was generated by `protoc-gen-apache-dubbo-es` and contains the service:

```ts
import { SayRequest, SayResponse } from "./eliza_pb.js";
import { MethodKind } from "@bufbuild/protobuf";

export const ElizaService = {
  // Reviewer (Contributor): Need to modify
  typeName: "buf.connect.demo.eliza.v1.ElizaService",
  methods: {
    say: {
      name: "Say",
      I: SayRequest,
      O: SayResponse,
      kind: MethodKind.Unary,
    },
  }
} as const;
```

The full file includes comments and additional RPCs, but the `const` above really is all Dubbo needs to provide clients.

The second file was generated by `protoc-gen-es`, and contains the request and response classes. You can see them being imported for the service definition. To learn more about `protoc-gen-es`, head over to the documentation for the [Protobuf-ES project](https://github.com/bufbuild/protobuf-es/blob/main/docs/generated_code.md).
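
To make the generated message classes a bit more concrete, here is a small sketch of using `SayRequest`, assuming the standard Protobuf-ES v1 message API (a constructor that accepts a partial message, `toJson()`, `toBinary()`, and a static `fromJson()`); the relative import path simply points at the generated file listed above.

```ts
import { SayRequest } from "./gen/apache-dubbo/eliza/v1/eliza_pb.js";

// Construct a message from a partial object; unset fields keep their defaults.
const req = new SayRequest({ sentence: "Hello" });

// Serialize to JSON or to the binary wire format.
const json = req.toJson();
const bytes = req.toBinary();

// Parse a message back from JSON.
const parsed = SayRequest.fromJson(json);
console.log(parsed.sentence); // "Hello"
```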

If your bundler does not handle the `.js` extension in the import `from "./eliza_pb.js"` correctly, you can configure it following our examples, or you can add the plugin option `import_extension=none` to remove the extension.

You can find the documentation for all available [plugin options on npmjs.com](https://www.npmjs.com/package/@apachedubbo/protoc-gen-apache-dubbo-es#plugin-options).

### Using the local files

To use the locally generated files in the tutorial, update the import path:

```diff
- import { ElizaService } from "gen/eliza_dubboweb";
+ import { ElizaService } from "gen/eliza_dubbo";
```
**docs/guide/dubboForWEB/Interceptors.md** (163 additions, 0 deletions)
# Interceptors

An interceptor can add logic to clients, similar to the decorators or middleware you may have seen in other libraries. Interceptors may mutate the request and response, catch errors and retry/recover, emit logs, or do nearly anything else.

For a simple example, this interceptor logs all requests:

```ts
import { Interceptor } from "@apachedubbo/dubbo";
import { createDubboTransport } from "@apachedubbo/dubbo-web";

const logger: Interceptor = (next) => async (req) => {
  console.log(`sending message to ${req.url}`);
  return await next(req);
};

createDubboTransport({
  baseUrl: "http://localhost:8080",
  interceptors: [logger],
});
```
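
Interceptors can also mutate the outgoing request before handing it to `next()`, for example to attach an authentication header. Here is a minimal sketch, assuming the request exposes a standard `header` (Headers) object; the token value is a placeholder.

```ts
import { Interceptor } from "@apachedubbo/dubbo";

// Placeholder token; in a real application this would come from your auth layer.
const token = "my-token";

const addAuthHeader: Interceptor = (next) => async (req) => {
  // `req.header` is assumed to be a standard Headers object.
  req.header.set("Authorization", `Bearer ${token}`);
  return await next(req);
};
```

It can then be registered on the transport alongside other interceptors, e.g. `interceptors: [logger, addAuthHeader]`.
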
You can think of interceptors like a layered onion. A request initiated by a client goes through the outermost layer first. Each call to `next()` traverses to the next layer. In the center, the actual HTTP request is run by the transport. The response then comes back through all layers and is returned to the client. In the array of interceptors passed to the transport, the interceptor at the end of the array is applied first.

To intercept responses, we simply look at the return value of `next()`:

```ts
const logger: Interceptor = (next) => async (req) => {
  console.log(`sending message to ${req.url}`);
  const res = await next(req);
  if (!res.stream) {
    console.log("message:", res.message);
  }
  return res;
};
```

The `stream` property of the response tells us whether this is a streaming response. A streaming response has not fully arrived yet when we intercept it — we have to wrap it to see individual messages:

```ts
const logger: Interceptor = (next) => async (req) => {
  const res = await next(req);
  if (res.stream) {
    // to intercept streaming response messages, we wrap
    // the AsyncIterable with a generator function
    return {
      ...res,
      message: logEach(res.message),
    };
  }
  return res;
};

async function* logEach(stream: AsyncIterable<any>) {
  for await (const m of stream) {
    console.log("message received", m);
    yield m;
  }
}
```

# Context values

Context values are a type-safe way to pass arbitrary values from the call site or from one interceptor to the next. You can use the `createContextValues` function to create a new `ContextValues`. The `contextValues` call option can be used to provide a `ContextValues` instance for each request.

`ContextValues` has methods to set, get, and delete values. The keys are `ContextKey` objects:

## Context Keys

`ContextKey` is a type-safe and collision-free way to use context values. It is defined using the `createContextKey` function, which takes a default value and returns a `ContextKey` object. The default value is used when the context value is not set.

```ts
import { createContextKey } from "@apachedubbo/dubbo";

type User = { name: string };

const kUser = createContextKey<User>(
  { name: "Anonymous" }, // Default value
  {
    description: "Current user", // Description useful for debugging
  },
);

export { kUser };
```

For values where a default doesn't make sense, you can just modify the type:

```ts
import { createContextKey } from "@apachedubbo/dubbo";

type User = { name: string };

const kUser = createContextKey<User | undefined>(undefined, {
  description: "Authenticated user",
});

export { kUser };
```

It is best to define context keys in a separate file and export them. This helps with code splitting, avoids circular imports, and also helps when the provider changes based on the environment.

## Example

Let's say you want to log the response body, but only for requests made from a specific component rather than for every request. You can use context values to achieve this.

First create a context key:

```ts
import { createContextKey } from "@apachedubbo/dubbo";

const kLogBody = createContextKey<boolean>(false, {
  description: "Log request/response body",
});

export { kLogBody };
```

Then in your interceptor, check the context value:

```ts
import type { Interceptor } from "@apachedubbo/dubbo";
import { kLogBody } from "./log-body-context.js";

const logger: Interceptor = (next) => async (req) => {
  console.log(`sending message to ${req.url}`);
  const res = await next(req);
  if (!res.stream && req.contextValues.get(kLogBody)) {
    console.log("message:", res.message);
  }
  return res;
};
```

Then in your component, set the context value:

```ts
import { createContextValues } from "@apachedubbo/dubbo";
import { kLogBody } from "./log-body-context.js";
import { elizaClient } from "./eliza-client.js";

const res = elizaClient.say(
  { sentence: "Hey!" },
  { contextValues: createContextValues().set(kLogBody, true) },
);
```

# Setting `fetch()` options

Another valuable use case for interceptors is customizing the Fetch API for individual requests by leveraging the `request.init` object.

For example, by default, Dubbo sets the Fetch option [redirect](https://developer.mozilla.org/en-US/docs/Web/API/fetch#redirect) to `error`, which means that a network error will be returned when a request is met with a redirect. However, if you wish to change this value to `follow`, for example, you can do so using an interceptor.

```ts
const followRedirects: Interceptor = (next) => async (request) => {
  return await next({
    ...request,
    init: {
      ...request.init,
      // Follow all redirects
      redirect: "follow",
    },
  });
};

const client = createPromiseClient(ElizaService, createDubboTransport({
  baseUrl: "http://localhost:8080",
  interceptors: [followRedirects],
}));
```