
Pass to wrap StableHLO ops in composite #2722

Draft
wants to merge 1 commit into base: main
Conversation

sdasgup3
Member

The pass wraps StableHLO ops, as specified by a pass option flag, in a composite op. The composite op will inherit all attributes of the original op.

For example, using the option flag --stablehlo-wrap-in-composite=op-names='stablehlo.add':

    func.func @add(%arg0: tensor<2xf32>, %arg1: tensor<2xf32>) -> tensor<2xf32> {
      %0 = "stablehlo.add"(%arg0, %arg1) : (tensor<2xf32>, tensor<2xf32>) -> tensor<2xf32>
      func.return %0 : tensor<2xf32>
    }

will become:

    func.func @add(%arg0: tensor<2xf32>, %arg1: tensor<2xf32>) -> tensor<2xf32> {
      %0 = stablehlo.composite "stablehlo.add" %arg0, %arg1 {
        decomposition = @stablehlo.add.impl
      } : (tensor<2xf32>, tensor<2xf32>) -> tensor<2xf32>
      return %0 : tensor<2xf32>
    }

    func.func private @stablehlo.add.impl(%arg0: tensor<2xf32>, %arg1: tensor<2xf32>) -> tensor<2xf32> {
      %0 = "stablehlo.add"(%arg0, %arg1) : (tensor<2xf32>, tensor<2xf32>) -> tensor<2xf32>
      func.return %0 : tensor<2xf32>
    }

The pass is also exposed as a C++ API, createStablehloWrapInCompositePass, to allow more flexible selection of the ops to wrap.
For example, the following wraps every op that is a stablehlo.add or stablehlo.convolution and is not already a composite:

```c++
auto pass = createStablehloWrapInCompositePass(
    [](Operation *op) {
      return (op->getName().getStringRef() == "stablehlo.add" ||
              op->getName().getStringRef() == "stablehlo.convolution") &&
             !isa<stablehlo::CompositeOp>(op);
    });
```

@sdasgup3 sdasgup3 added the Transformations Pertaining to MLIR passes and transformations label Feb 27, 2025

@sdasgup3 sdasgup3 marked this pull request as draft February 27, 2025 01:34

_Wraps a non-composite StableHLO op in a composite op._

Wraps StableHLO ops, as specified by the pass option flag, in a
Member


Let's talk about the design more. It's missing a few things to be generally useful:

  • What to do with ops that have attributes?
  • What should the composites be named, or should the name be user specified as a part of the callback fn, etc

Need to think more, maybe we have a little brainstorm and run through some examples

Member Author


All good points! SGTM.

What to do with ops that have attributes?

All the op attributes are preserved as composite attributes. To demonstrate this, I used the convolution op as an example in stablehlo/tests/transforms/stablehlo_wrap_in_composite.mlir.
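
For illustration only, here is a simplified sketch of what attribute preservation looks like (hypothetical shapes and attributes, not the exact contents of that test file; a real stablehlo.convolution also carries dimension_numbers and other attributes):

    // Before: a convolution carrying attributes.
    %0 = "stablehlo.convolution"(%lhs, %rhs) {
      batch_group_count = 1 : i64,
      feature_group_count = 1 : i64,
      window_strides = array<i64: 1, 1>
    } : (tensor<1x8x8x32xf32>, tensor<3x3x32x64xf32>) -> tensor<1x8x8x64xf32>

    // After: the same attributes are carried on the composite as
    // composite_attributes, and the decomposition holds the original op.
    %0 = stablehlo.composite "stablehlo.convolution" %lhs, %rhs {
      composite_attributes = {
        batch_group_count = 1 : i64,
        feature_group_count = 1 : i64,
        window_strides = array<i64: 1, 1>
      },
      decomposition = @stablehlo.convolution.impl
    } : (tensor<1x8x8x32xf32>, tensor<3x3x32x64xf32>) -> tensor<1x8x8x64xf32>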

@sdasgup3 sdasgup3 marked this pull request as ready for review February 27, 2025 09:21
func.return %2 : tensor<64x3x3x32xi32>
}

// -----
Member


Can we have an example with regions like reduce?
Should this pass run as part of a pass pipeline? If we run a decomposer pass that introduces stablehlo.convolution, they also need to get wrapped right?

Member Author


Added.
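
For context, a hedged sketch (simplified and hypothetical, not the exact test-file contents) of how a region-bearing op like stablehlo.reduce could look once wrapped, with the body region moving intact into the decomposition function:

    func.func @reduce(%arg0: tensor<4x8xf32>, %init: tensor<f32>) -> tensor<4xf32> {
      %0 = stablehlo.composite "stablehlo.reduce" %arg0, %init {
        composite_attributes = { dimensions = array<i64: 1> },
        decomposition = @stablehlo.reduce.impl
      } : (tensor<4x8xf32>, tensor<f32>) -> tensor<4xf32>
      return %0 : tensor<4xf32>
    }

    // The reduction region is preserved unchanged inside the decomposition.
    func.func private @stablehlo.reduce.impl(%arg0: tensor<4x8xf32>, %init: tensor<f32>) -> tensor<4xf32> {
      %0 = "stablehlo.reduce"(%arg0, %init) <{dimensions = array<i64: 1>}> ({
      ^bb0(%a: tensor<f32>, %b: tensor<f32>):
        %sum = stablehlo.add %a, %b : tensor<f32>
        stablehlo.return %sum : tensor<f32>
      }) : (tensor<4x8xf32>, tensor<f32>) -> tensor<4xf32>
      func.return %0 : tensor<4xf32>
    }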

Should this pass run as part of a pass pipeline?

Yes, that is the more practical usage of the pass.

If we run a decomposer pass that introduces stablehlo.convolution, they also need to get wrapped right?

If the decomposer pass emits a convolution and the user intends to wrap that convolution into a composite, then yes, it will be wrapped.
Let me know if that addresses your concern.

@sdasgup3 sdasgup3 marked this pull request as draft February 28, 2025 01:03
3 participants