Editorial: Various style and wording tweaks #797

Merged: 13 commits into webmachinelearning:main on Jan 31, 2025

Conversation

@inexorabletash (Member) commented on Dec 7, 2024:

This PR addresses the changes outlined in #783

To make reviewing easier, I've left it as a chain of commits, each one tackling a single stylistic change.

Resolves #783



@@ -4040,11 +4040,11 @@ partial dictionary MLOpSupportLimits {
 <dl dfn-type=dict-member dfn-for=MLGruOptions>
 : <dfn>bias</dfn>
 ::
-    The 2-D input bias tensor of shape *[numDirections, 3 * hiddenSize]*. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to the {{MLGruOptions/layout}} argument.
+    The 2-D input bias tensor of shape *[numDirections, 3 * hiddenSize]*. The ordering of the bias vectors in the second dimension of the tensor shape is specified according to {{MLGruOptions/layout}}.
Contributor:

Should we format numDirections and hiddenSize each separately, like [*numDirections*, 3 * *hiddenSize*]? The same applies to other occurrences.

Member Author (@inexorabletash):

Right now the coding conventions say "When concisely defining a tensor's layout, use the syntax *[ ... ]*", which we'd need to revise. More feedback wanted!
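
To make the alternatives concrete, here is how the gru() bias sentence from the hunk above would read under each markup (an illustrative sketch of the open question only, not a settled convention):

Current convention (one emphasis span around the whole shape):
    The 2-D input bias tensor of shape *[numDirections, 3 * hiddenSize]*.

Alternative raised in review (each variable emphasized separately):
    The 2-D input bias tensor of shape [*numDirections*, 3 * *hiddenSize*].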

index.bs: two additional review threads (outdated; resolved).

index.bs (outdated):
@@ -2495,7 +2495,7 @@ partial dictionary MLOpSupportLimits {
 </dl>

 <div class="note">
-    A *depthwise* conv2d operation is a variant of grouped convolution, used in models like the MobileNet, where the *options.groups* = inputChannels = outputChannels and the shape of filter tensor is *[options.groups, 1, height, width]*
+    A *depthwise* conv2d operation is a variant of grouped convolution, used in models like the MobileNet, where the *options*.{{MLConv2dOptions/groups}} = inputChannels = outputChannels and the shape of filter tensor is *[options.groups, 1, height, width]*
Contributor:

Should inputChannels and outputChannels be formatted? And in *[options.groups, 1, height, width]*, should we use [*options*.{{MLConv2dOptions/groups}}, 1, *height*, *width*]?

I found that the prose of MLConv2dOptions.inputLayout and MLConv2dOptions.filterLayout also refers to variables such as inputChannels and outputChannels. Should we format those separately as well?

Member Author (@inexorabletash):

Formatted *inputChannels* and *outputChannels* in that instance (in f714fc1). As noted above, we need to decide more generally how to format shapes.
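
For comparison, the depthwise filter shape from the conv2d note above would read as follows under each option (again only an illustration of the alternatives being discussed):

Current convention:
    the shape of filter tensor is *[options.groups, 1, height, width]*

Alternative raised in review:
    the shape of filter tensor is [*options*.{{MLConv2dOptions/groups}}, 1, *height*, *width*]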

@huningxin (Contributor) left a review:

LGTM!

@fdwr (Collaborator) left a review:

One confirmation, else LGTM. TY Josh.

docs/SpecCodingConventions.md: two review threads (resolved; one outdated).
@inexorabletash (Member Author) commented:

I think this is good to merge now, @fdwr?

@fdwr (Collaborator) replied:

> I think this is good to merge now, @fdwr?

Yep, mergeable sir. 🫡

@fdwr merged commit 74b5ae5 into webmachinelearning:main on Jan 31, 2025. 2 checks passed.
github-actions bot added a commit that referenced this pull request on Jan 31, 2025 (SHA: 74b5ae5; reason: push, by fdwr).
Linked issue closed by this pull request: Proposed style/wording tweaks (#783).