
discussion, plugin, pipboard: support image/table display when evaluating models #358

Open
WenheLI opened this issue Jul 20, 2020 · 8 comments
Labels: pipboard (Pipboard), plugin (Pipcook plugin addition, bug report and changes)

Comments

WenheLI (Collaborator) commented Jul 20, 2020

Currently, pipboard can display numerical data (accuracy, recall rate, and losses).
However, most model evaluation processes require images (e.g. generated images from a GAN model) or tables (e.g. a confusion matrix in a classification task) for better parameter tuning.

Therefore, I think it is necessary to support image/table display in both pipboard and the eval plugin protocol.

Discussions on how to implement it and the protocol design are very welcome.

WenheLI added the plugin (Pipcook plugin addition, bug report and changes) and pipboard (Pipboard) labels on Jul 20, 2020
yorkie (Member) commented Jul 20, 2020

@WenheLI our current evaluation result is a JSON string whose shape depends on the model-evaluate plugin. Before we start working on Pipboard, is it possible to have a protocol design for evaluateMaps?

WenheLI (Collaborator, Author) commented Jul 20, 2020

@yorkie - If we render those maps in the pipcook process, we can write images directly to disk and refer to them using URIs. We would just need to modify pipboard to support image display. However, this solution compromises interactivity.
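A minimal sketch of this disk-based approach, assuming a Node.js runtime; the function and file names here are illustrative, not part of pipcook's actual API:

```typescript
// Hypothetical sketch: persist a rendered evaluation image to disk and
// hand Pipboard a file:// URI it could embed in an <img> tag.
import { writeFileSync, mkdtempSync } from 'fs';
import { join } from 'path';
import { tmpdir } from 'os';
import { pathToFileURL } from 'url';

function saveEvaluationImage(png: Buffer): string {
  // Write into a fresh temp directory to avoid name collisions.
  const dir = mkdtempSync(join(tmpdir(), 'pipcook-eval-'));
  const file = join(dir, 'confusion-matrix.png');
  writeFileSync(file, png);
  return pathToFileURL(file).href;
}

// A few PNG magic bytes stand in for a real rendered image.
const uri = saveEvaluationImage(Buffer.from([0x89, 0x50, 0x4e, 0x47]));
console.log(uri.startsWith('file://')); // true
```

The drawback noted above applies: once the image is a static file on disk, the frontend can only display it, not re-render or explore the underlying numbers.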

Another solution is to render the data in the frontend; in that case, we might need a data structure like:

{
    numericalData: number;
    renderedData: ArrayLike<number>;
    type: 'ConfusionMatrix' | 'Image' | ...;
}

This solution adds complexity but gives flexibility when exploring data and models.
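To make the proposed shape concrete, here is an illustrative payload following it; the union is narrowed to the two types discussed so the example type-checks, and the values are invented:

```typescript
// The field names mirror the structure proposed above.
type RenderType = 'ConfusionMatrix' | 'Image';

interface RenderPayload {
  numericalData: number;           // a scalar metric, e.g. overall accuracy
  renderedData: ArrayLike<number>; // flattened matrix cells or pixel data
  type: RenderType;                // tells the frontend which renderer to use
}

// A 2x2 confusion matrix [[40, 2], [3, 55]], flattened row-major;
// accuracy = (40 + 55) / 100 = 0.95.
const payload: RenderPayload = {
  numericalData: 0.95,
  renderedData: [40, 2, 3, 55],
  type: 'ConfusionMatrix',
};

console.log(payload.type); // ConfusionMatrix
```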

yorkie (Member) commented Jul 20, 2020

> If we render those maps in the pipcook process, we can write images directly to disk and refer to them using URIs. We would just need to modify pipboard to support image display. However, this solution compromises interactivity.

How does Pipboard know how to render the data? Does that mean we will have some hard-coded logic to read these URLs and display them?

By the way, I like the 2nd one despite its implementation complexity.

WenheLI (Collaborator, Author) commented Jul 20, 2020

We need to tell pipboard how to render different types of maps using something like:

switch (_type) {
    case 'ConfusionMatrix':
        renderCM(renderedData);
        break;
    case 'Image':
        renderImage(renderedData);
        break;
    // ...
}
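An alternative to the switch sketched above is a lookup table, which lets new render types be registered without touching the dispatch code. This is a hypothetical design sketch; `renderCM` and `renderImage` are the placeholder renderer names from the comment, stubbed out here so the example runs:

```typescript
// Dispatch-table variant of the switch above (illustrative only).
type Renderer = (data: ArrayLike<number>) => void;

const renderCM: Renderer = (data) => console.log(`matrix of ${data.length} cells`);
const renderImage: Renderer = (data) => console.log(`image of ${data.length} pixels`);

// Map each protocol `type` string to its frontend renderer.
const renderers: Record<string, Renderer> = {
  ConfusionMatrix: renderCM,
  Image: renderImage,
};

function render(type: string, data: ArrayLike<number>): void {
  const fn = renderers[type];
  if (!fn) throw new Error(`no renderer registered for type "${type}"`);
  fn(data);
}

render('ConfusionMatrix', [40, 2, 3, 55]); // matrix of 4 cells
```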

yorkie (Member) commented Jul 20, 2020

I see. Does this add a new protocol that tells the model-evaluate plugin how to write evaluationMaps?

WenheLI (Collaborator, Author) commented Jul 20, 2020

This is the previous design:

export interface EvaluateResult {
  pass?: boolean;
  [key: string]: any;
}

Maybe we can change it into:

export interface EvaluateResult {
  pass?: boolean;
  requireRender: boolean;
  renderData?: ArrayLike<number>;
  [key: string]: any;
}
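As a quick illustration of how a model-evaluate plugin might fill in this extended interface (the classifier and its numbers are invented; extra metrics ride on the index signature):

```typescript
// The interface mirrors the proposal above.
export interface EvaluateResult {
  pass?: boolean;
  requireRender: boolean;
  renderData?: ArrayLike<number>;
  [key: string]: any;
}

// Hypothetical model-evaluate plugin output for a binary classifier.
function evaluateClassifier(): EvaluateResult {
  return {
    pass: true,
    requireRender: true,
    renderData: [40, 2, 3, 55], // flattened 2x2 confusion matrix
    accuracy: 0.95,             // extra metric via the index signature
  };
}

const result = evaluateClassifier();
console.log(result.requireRender && result.renderData?.length === 4); // true
```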

yorkie (Member) commented Jul 20, 2020

This definition makes an interface in pipcook-core Pipboard-oriented, which isn't great: a plugin developer shouldn't have to learn how Pipboard renders. How about putting these types into the EvaluateResult?

And I think requireRender is not needed; the renderer can detect by itself whether the data can be rendered :)
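A minimal sketch of that detection idea, dropping `requireRender` and letting the renderer probe the result itself (field names follow the thread; the check is an assumption about what "renderable" means):

```typescript
// EvaluateResult without the requireRender flag.
interface EvaluateResult {
  pass?: boolean;
  renderData?: ArrayLike<number>;
  [key: string]: any;
}

// The renderer decides: anything carrying non-empty renderData is renderable.
function isRenderable(result: EvaluateResult): boolean {
  return result.renderData !== undefined && result.renderData.length > 0;
}

console.log(isRenderable({ pass: true }));                     // false
console.log(isRenderable({ pass: true, renderData: [1, 2] })); // true
```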

yorkie (Member) commented Jul 20, 2020

BTW it's useful to add accuracy, recall rate, and losses, too :p
