
custom message renderer or wrapper #35

Closed
Laurian opened this issue Feb 25, 2024 · 18 comments
Assignees
Labels
enhancement New feature or request

Comments

@Laurian

Laurian commented Feb 25, 2024

Several chat UIs out there have extra widgets along a message like thumbs up / down, share, report, etc.

It would be handy to have a way to decorate the message box with interactive components.
I think there are 3 ways to approach this:

  1. custom components to render messages

  2. custom component to wrap the message (keep current rendering)

  3. injecting a widget/markup in the current nluxc-text-message-content div element

In Discord @salmenus mentioned something along the lines of passing a custom renderer:

<AiChat messageRenderer={MyCustomMessageRenderer} />
type CustomMessageRenderer = (message: string, extras: OtherMessageRelatedInfo) => ReactElement;

But I guess this won't work in stream mode; it will work in fetch mode, unless we also make the component deal with the streaming adapter.

  2. Alternatively, a wrapper would just decorate the message, which is still rendered by the original core component (<AiChat /> renders <MyCustomMessageWrapper message={message} extras={extras}>{children}</My…>), where {children} is the original core component that handles streaming, markdown, etc. And I guess that while streaming, the message and extras props will get updated; maybe a complete prop (fed from the streaming adapter observer?) would also be needed, so I know when I can interact with my decorations.

  3. Like 2, let's keep the original rendering, but instead of wrapping it, just add some custom component in React or markup in JS:

<AiChat messageHeader={MyCustomMessageHeader} messageFooter={MyCustomMessageFooter} />

But even injecting plain markup would be enough: with React, having messageHeader={<div className="header"></div>} would allow me to render what I need inside it with createPortal.

It would be nice if the customisation could be done in a way that works at the core component level, so that it could be used in both JS and React.
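The wrapper idea above (decorating output that the core component renders, with a flag to know when streaming has finished) can be sketched with plain functions, which also illustrates the framework-agnostic, core-level customisation suggested here. All names in this sketch (coreRenderMessage, MessageWrapper, complete) are illustrative assumptions, not NLUX APIs:

```typescript
// Illustrative sketch only: none of these names are actual NLUX APIs.

type MessageExtras = { complete: boolean };

// Stand-in for the core renderer that handles streaming/markdown internally.
const coreRenderMessage = (message: string): string => `<p>${message}</p>`;

// A user-supplied wrapper decorates whatever the core renderer produced.
type MessageWrapper = (rendered: string, extras: MessageExtras) => string;

const myWrapper: MessageWrapper = (rendered, extras) =>
  `<div class="msg">${rendered}${
    extras.complete ? '<div class="actions">share | report</div>' : ""
  }</div>`;

// The action widgets only appear once `complete` signals that streaming is done.
console.log(myWrapper(coreRenderMessage("Hello"), { complete: true }));
```

Here the `complete` flag plays the role of the "now I can interact with my decorations" signal mentioned above.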

@salmenus salmenus self-assigned this Mar 6, 2024
@salmenus salmenus added the enhancement New feature or request label Mar 6, 2024
@salmenus
Member

salmenus commented Mar 6, 2024

Thanks @Laurian for raising this, and for the well-detailed suggestion.

It is indeed something I've been thinking about, as the message rendering component is quite important and responsible for a big part of the interaction with the LLM.

I like your suggestion of a custom component to wrap the message. The message rendering is a bit complex, as that's where we handle rendering markdown as it's being generated. But we can probably have a custom message renderer in React as follows:

<AiChat messageRenderer={MyCustomMessageRenderer} />

With MyCustomMessageRenderer defined as follows:

import {useChatResponseRenderer} from '@nlux/react';

const MyCustomMessageRenderer = (observer: IObserver<DataType>, extras: OtherMessageRelatedInfo): ReactElement => {
    const [ResponseRenderer, status] = useChatResponseRenderer(observer);
    return (
        <div>
            <div>Some custom stuff!</div>
            <ResponseRenderer />
            {(status === 'loading') && <div>Stuff to show when loading!</div>}
            {(status === 'complete') && <div>Stuff to show when message is rendered!</div>}
        </div>
    );
};
  • The ChatResponseRenderer component will be aware of what's being generated and will render it as expected, using the appropriate rendering config.
  • The status property allows developers to decide what other content to render based on the status.

What do you think?

I'm moving this feature request into the Features Roadmap and I'll be prioritising it.

@TechWithTy

@salmenus We're working on this feature and we need the unminified nlux-core.js to be able to interact with the message component effectively.

@salmenus
Member

@TechWithTy are you using React or Vanilla JS ?

It's fairly easy with core Vanilla JS, but it requires more work with React.

@TechWithTy

We are using ReactJS/TypeScript and we are trying to modify this file, but it is minified, unfortunately:
nlux-core.txt

@nehal7-ml

nehal7-ml commented Mar 20, 2024

@salmenus , since we're using ReactJS/Typescript and attempting to enhance our chat functionality, we're encountering challenges with modifying the minified nlux-core.js file. As our application primarily relies on React, we're seeking guidance on how to approach this task effectively.
We aim to add buttons for reactions and options to regenerate within our chat reply containers. Given that our application is React-based, how can we achieve this without directly manipulating the DOM, which is not allowed?
Any insights or suggestions on how we could tackle this within the React ecosystem would be greatly appreciated. Thanks! #chatgpt-feature

@salmenus
Member

Ok. Given the number of requests, I'm prioritising this issue.

Moving it to In Progress.

It's going to be similar to the JSX support for user personas: you will be able to provide your own React component to handle the rendering of the chat message.

You'll have it ready in the next 48 hours.

@TechWithTy

Awesome @salmenus

@salmenus
Member

Still WIP. I had to refactor the whole part of the library related to DOM rendering and update it for seamless support of JS and React. I'm aiming to publish a new version tomorrow with the new rendering and support for custom messages.

I'll keep you posted via this thread.

@nehal7-ml

Thank you, @salmenus

@salmenus
Member

salmenus commented Mar 27, 2024

Still work-in-progress. NPM not published yet.

Progress Update:

  • New core DOM components for different sections of the AI chat UI
  • New React components for different sections of the AI chat UI
  • Support for custom component for message rendering in React (main feature request)
  • Unit tests for the new components
  • Styling of the new components
  • Integration with existing optimisations (markdown streaming, scroll-to-bottom when generating, etc)
  • Documentation and code examples

I'll keep you posted via this thread once new NPM is published.
Code change here.

@TechWithTy

Awesome @salmenus Thank you!

@salmenus
Member

salmenus commented Apr 8, 2024

This clearly took more than 48 hours!
I'm pushing changes to this PR and aiming to merge later this week.

I had to do an entire refactoring of the view/UI layer of both the React and JS ports of the library.
It was time-consuming and took longer than expected, but I think it will be very beneficial in the long run.

@chip-davis

Awesome, can't wait! Is there any low-hanging fruit we could help with?

@salmenus
Member

PR merged into the main branch with the new React implementation. RC to follow.

@TechWithTy

Lets goo!

@salmenus
Member

salmenus commented May 15, 2024

Custom renderers are finally here! 🎉
Along with a full re-write of NLUX React layer ✔️

This has just been released as part of 2.1.0-beta
This is a major release with several changes to config options, so expect some breaking changes if you're using v1.x.

You can give the feature a try in this code sandbox:
https://codesandbox.io/p/sandbox/wild-rose-84wyvw?file=%2Fsrc%2FApp.tsx

The custom component is provided via responseComponent and supports both streaming and fetch modes.
In fetch mode, you will also get the full JSON returned from the server.

This major code change and React re-write will enable several other features (RSC, component streaming, etc.).
And, like other parts of the NLUX code base, it's a high-quality, high-performance change covered by 600+ unit tests.

We're currently working on updating the docs and improving theming (the last piece of this major code change).
Meanwhile, you can look at the TypeScript type definitions or the code base to figure out option values.

--

In action, from the code sandbox example linked above:

const MyCustomResponseRenderer: ResponseRenderer<string> = (
  props: FetchResponseComponentProps<string> | StreamResponseComponentProps<string>
) => {
  console.log("Data fetched from LangServe!");
  console.dir(props);

  // Narrow the union manually; dataTransferMode tells us which mode is active.
  const propsForFetch = props as FetchResponseComponentProps<string>;
  const propsForStream = props as StreamResponseComponentProps<string>;
  const dataTransferMode = props.dataTransferMode as any;

  return (
    <>
      {/* In fetch mode, the full response content is available up front */}
      {dataTransferMode === "fetch" && <div>{propsForFetch.content}</div>}
      {/* In stream mode, NLUX renders the streamed markdown into containerRef */}
      {dataTransferMode === "stream" && (<div ref={propsForStream.containerRef} />)}
      <div>Footer Custom Response Component</div>
    </>
  );
};
  <AiChat
    adapter={adapter}
    messageOptions={{
      responseComponent: MyCustomResponseRenderer
    }}
  />
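The `as` casts in the snippet above can be avoided with discriminated-union narrowing on `dataTransferMode`. A minimal self-contained sketch; the prop types here are simplified stand-ins I made up for illustration, not the real NLUX definitions:

```typescript
// Simplified stand-ins for the NLUX prop types (assumptions, not the real ones).
type FetchProps = { dataTransferMode: "fetch"; content: string };
type StreamProps = { dataTransferMode: "stream"; containerId: string };
type RendererProps = FetchProps | StreamProps;

// Checking the shared discriminant narrows the union, so no casts are needed.
function describeResponse(props: RendererProps): string {
  if (props.dataTransferMode === "fetch") {
    return `fetch: ${props.content}`; // `content` is known to exist here
  }
  return `stream: into #${props.containerId}`; // and `containerId` here
}

console.log(describeResponse({ dataTransferMode: "fetch", content: "Hello" }));
```

The same narrowing pattern would apply inside a renderer's JSX, provided the real prop types share a literal-typed discriminant.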

@salmenus salmenus pinned this issue May 15, 2024
@salmenus salmenus unpinned this issue May 15, 2024
@salmenus salmenus pinned this issue May 15, 2024
@salmenus
Member

salmenus commented May 28, 2024

Reference doc on custom renderers is now available here:
https://docs.nlkit.com/nlux/reference/ui/custom-renderers

@salmenus
Member

Examples on the docs website with both streamed and batched custom response renderers here:
https://docs.nlkit.com/nlux/examples/custom-response-renderers

Also: NLUX v2 is now released 🎉
With this feature included ✅

@salmenus salmenus unpinned this issue May 31, 2024