
Demonstrating Native Addons #5

Merged
merged 7 commits into from
May 23, 2022

Conversation


@CMCDragonkai CMCDragonkai commented May 22, 2022

Description

Derived from #2

In helping solve the snapshot isolation problem in MatrixAI/js-db#18, we needed to lift the hood and go down into the C++ level of NodeJS.

To do this, I need a demonstration of how native addons can be done in our demo lib here.

There are 2 ecosystems for building native addons:

  • prebuild
  • node-pre-gyp

Of the two, the prebuild ecosystem is used by UTP and leveldb, so we will continue using that. Its advantages were discussed back in 2016 here: prebuild/prebuild#159

The basic idea is that Node supports a "NAPI" system that enables Node applications to call into C++. It is essentially the FFI system of NodeJS, and a bidirectional one, as C++ code can call back into NodeJS JS functions.

The core library is node-gyp. In the prebuild ecosystem it is wrapped by node-gyp-build, which you'll notice is the one we are already using in this repo. The main feature here is the ability to supply prebuilt binaries instead of expecting the end user to always compile from source.

Further details here: https://nodejs.github.io/node-addon-examples/build-tools/prebuild (it also compares it to node-pre-gyp).

The node-gyp-build has to be a dependency, not devDependencies, because it is used during runtime to automatically find the built shared-object/dynamic library and to load it.

It looks like this:

import nodeGypBuild from 'node-gyp-build';
const bindings = nodeGypBuild('./path/to/dir/containing/gyp/file');
bindings.someNativeFunction();

Internally nodeGypBuild ends up calling NodeJS's require() function, which supports loading *.node binaries (the shared objects compiled using the NAPI C++ headers). See: https://github.com/prebuild/node-gyp-build/blob/2e982977240368f8baed3975a0f3b048999af40e/index.js#L6

The require is supplied by the NodeJS runtime. If you execute the JS with a different runtime, it may support the CommonJS standard and thus understand the require calls, but it may not be compatible with native modules compiled against the NAPI headers. This is relevant since you also have to load the binary that matches your OS libraries and CPU architecture. It's all dynamic linking under the hood. This is also why we use node-gyp-build, which automates some of this lookup procedure.

As a side note about bundlers: bundlers are often used as part of a build process that targets web platforms. Since the web platform does not understand require calls, bundlers will perform some sort of transclusion. This is also the case when ES6 import targets files on disk. Details on this process are here: https://github.com/evanw/esbuild/blob/master/docs/architecture.md#notes-about-linking. Bundlers will often call this "linking", and when targeting web platforms this is basically a form of static linking, since JS running in browsers cannot load JS files from disk.

This is also why in some cases one should replace native addons with WASM, as bundlers can support static linking of WASM (which is cross-platform) into a web bundle. But some native addons depend on OS features (like databases with persistence) and fundamentally cannot be converted into WASM binaries. In the future, it would make sense to turn our crypto code into WASM binaries. But DB code is likely to always be native, as it has to be persistent. As the web develops and gains extra features, it may eventually be possible for all native code to be done via WASM (but this may be a few years off).

Now, the native module itself is just a C++ file like index.cpp. We should prefer using .cpp and .h as the most portable extensions.

Additionally, there must be a binding.gyp file that looks like this:

{
  "targets": [{
    "target_name": "somename",
    "include_dirs": [
      "<!(node -e \"require('napi-macros')\")"
    ],
    "sources": [ "./index.cpp" ]
  }]
}

Basically, it is another configuration file that tells node-gyp how the C++ code should be compiled. The target_name specifies the name of the addon file, so the output will be somename.node. The sources are self-explanatory. The include_dirs entries have the ability to execute shell commands; in this case node -e is used to execute a script that prints a path to C++ headers to be included during compilation.

The C++ code needs to use the NAPI headers; however, there's a macro library that makes writing NAPI addons easier: https://github.com/hyperdivision/napi-macros. I've seen it used in utp-native and classic-level.

The C++ code may look like this:

#include <node_api.h>
#include <napi-macros.h>

NAPI_METHOD(times_two) {
  NAPI_ARGV(1)
  NAPI_ARGV_INT32(number, 0)

  number *= 2;

  NAPI_RETURN_INT32(number)
}

NAPI_INIT() {
  NAPI_EXPORT_FUNCTION(times_two)
}

This ends up exporting a native module containing the times_two function, which multiplies a number by 2 and returns an int32 number.
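From JS, the compiled addon can then be loaded and called like any other module. A sketch of a wrapper (timesTwo and the pure-JS fallback are hypothetical additions here, so the snippet runs even where the addon hasn't been built):

```javascript
// Hypothetical wrapper around the native addon shown above.
// node-gyp-build locates the compiled somename.node for this platform;
// we fall back to pure JS when no binary is available (e.g. in tests).
let native;
try {
  native = require('node-gyp-build')(__dirname);
} catch (e) {
  native = null;
}

function timesTwo(n) {
  return native ? native.times_two(n) : n * 2;
}

console.log(timesTwo(21)); // 42 via the addon or the fallback
```

A fallback like this is optional, but it illustrates the boundary: the JS side only sees plain functions, regardless of whether they are backed by C++.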

It's also important that node-gyp-build is set up as an install script in the package.json:

  "scripts": {
    "install": "node-gyp-build"
  }

This means that when you run npm install (which is used to install all the dependencies of an NPM package, or to install a specific NPM package), it will run node-gyp-build during the installation process.

This means that currently in our utils.nix node2nixDev expression still requires the npm install command. This used to exist, however I removed it during MatrixAI/TypeScript-Demo-Lib#37 thinking it had no effect. But it was confirmed by svanderburg/node2nix#293 (comment) that the npm install command is still run in order to execute build scripts. And node-gyp-build is now part of the installation process. We should include: https://github.com/svanderburg/node2nix/blob/8264147f506dd2964f7ae615dea65bd13c73c0d0/nix/node-env.nix#L380-L387 with all the necessary flags and parameters too. We may be able to make it work if we hook our build command prior to npm install. I imagine that this should be possible since the npm rebuild command is executed prior. So we need to investigate this.

In order to make this all work, our Nix environment is going to need all the tools for source compilation. Now according to https://github.com/nodejs/node-gyp#on-unix we will need python3, make and gcc. Our shell.nix naturally has make and gcc because we are using pkgs.mkShell which must extend from stdenv.mkDerivation. However python3 will be needed as well.

The node2nix has some understanding of native dependencies (this is why it also brings in python in its generated derivation svanderburg/node2nix#281), and I believe it doesn't actually build from source (except in some overridden dependencies).

Some npm dependencies are brought in via nixpkgs nodePackages because the node2nix derivation isn't enough to build them (they have complex native dependencies), such as node-gyp-build itself or Vercel's pkg. This is also why I had to provide nodePackages.node-gyp-build in our buildInputs overrides in utils.nix. It is important that any dependencies acquired via nixpkgs are the same version we use in our package.json. This is the case for:

    "node-gyp-build": "4.4.0",
    "pkg": "5.6.0"

Ideally we won't need to do this for our own native packages if js-db ends up forking classic-level or leveldown. I think this trick is only relevant for our "build tools" and not our runtime dependencies.

The remaining problem is cross-compilation, as this only enables building from source if you are on NixOS and/or using Nix. Windows and MacOS will require their own setup. Since our development environment is all Nix focused, we don't have to worry about those, but end users who want to rebuild from scratch will need to set up their development environment based on the information in https://github.com/nodejs/node-gyp. A more pressing question is how we, in our Nix development environment, will be capable of building cross-platform native addons for distribution.

This is where the prebuild ecosystem comes in, in particular https://github.com/prebuild/prebuildify-cross. This is used by leveldb to build for different platforms and save the cross-compiled objects. These objects are then hosted on GitHub releases and automatically downloaded upon installation by downstream users. In the case that they are not downloadable, they are built from source. https://github.com/Level/classic-level/blob/f4cabe9e6532a876f6b6c2412a94e8c10dc5641a/package.json#L21-L26

However in our Nix based environment, I wonder if we can avoid using docker to do cross compilation, and instead use Nix to provide all the tooling to do cross-compilation. We'll see how this plays out eventually.

Some additional convenience commands now:

# install the current package and install all its dependencies, and build them ALL from source
npm install --build-from-source
# install a specific dependency and build it from source
npm install classic-level --build-from-source
# runs npm build on current package and all dependencies, and also recompiles all C++ addons
npm rebuild
# runs npm build on current package and all dependencies, and specifically recompiles sqlite3 package which has C++ addon
npm rebuild --build-from-source=sqlite3

Issues Fixed

Tasks

  • 1. Integrate node-gyp-build
  • 2. Create a native module exporting demo functions like addOne for primitives, setProperty for a reference-passing procedure, and makeArray for heap allocation
  • 3. Fix the nix expressions to support node-gyp-build and other build scripts, and see if we can eliminate our postInstall hook, by relying on package.json hooks instead
  • 4. Integrate prebuildify to precompile binaries and host them on our GitHub release... but this depends on whether typescript-demo-lib is used as a library or as an application: if used as an application, then the pkg build is used; if used as a library, then one must install the native binary from the same GitHub release, which means the native binary must be part of the same release page.
    • The pkg integration may just be a matter of setting the assets path in package.json to the local prebuilds directory.
    • See the scripts that other projects used WIP: Demonstrating Native Addons TypeScript-Demo-Lib#38 (comment)
    • Ensure that compiled native addons in nix have their rpath removed, because nodejs addons shouldn't have an rpath set, and this enables them to be portable
  • [ ] 5. Cross compilation, prebuildify-cross or something else that uses Nix - we must use CI/CD to do cross compilation (not sure about other architectures like ARM)
  • 6. Update the @typescript-eslint packages to match js-db to avoid the warning message.
  • 7. Add typescript typings to a native module
  • [ ] 8. Update README.md to indicate the 2 branches of typescript-demo-lib, the main and the native branch, where the native branch indicates how to build native addons - this will be done in a separate repo: https://github.com/MatrixAI/TypeScript-Demo-Lib-Native based off https://gitlab.com/MatrixAI/Employees/matrix-team/-/issues/8#note_885403611
  • 9. Migrate changes to https://github.com/MatrixAI/TypeScript-Demo-Lib-Native and request Mac access to the repository on GitLab. This branch will just be for development first. The changes here are too significant to keep within the same template repository.
  • 10. The pkg bundle can be optimised regarding which prebuilt architectures it bundles; right now it bundles all architectures even when the target implies only a single architecture is required. This can slim the final output of pkg so it's not storing unnecessary things. This may mean that pkg requires a dynamically generated --config.
  • 11. See if nix-build ./release.nix -A application can use the prebuilds/ directory as well, as this can unify with pkg. That way everything can use the prebuilds/ directory. But we would want to optimise it with task 10.
  • [ ] 12. Ensure that npm test can automatically run general tests, and platform-specific tests if detected on the relevant platform - this can be done in polykey as a script
  • 13. Automatic npm publish for prerelease and release based on staging and master branches, add these to the CI/CD jobs
  • 14. Ensure that integration CI/CD jobs are passing by running the final executable with all the bundled prebuilt binaries

Future Tasks

Final checklist

  • Domain specific tests
  • Full tests
  • Updated inline-comment documentation
  • Lint fixed
  • Squash and rebased
  • Sanity check the final build

@CMCDragonkai CMCDragonkai changed the title Feature native Demonstrating Native Addons May 22, 2022

@CMCDragonkai
Member Author

Now it's back to the tag pipeline, but we put in the necessary checks to prevent running when the commit title is a version release tag.

@CMCDragonkai
Member Author

The build:prerelease job is now working. Some prereleases were pushed to NPM under typescript-demo-lib. Everything now points to typescript-demo-lib-native except the executable name.

@CMCDragonkai
Member Author

The prerelease build is now working fine.

Final step is release job which is done after integration.

We will use our original job, which produces a GH release and tag, but also combine it with a release job integration:release that does the same thing as build:prerelease. Maybe it should be called release:npm.

In fact our build:prerelease could be called release:npm-prerelease as the staging is just for grouping.

@CMCDragonkai
Member Author

Also we'd like to auto-merge the staging into master when this all passes. This should be done after all the integration runs are done.

This will require the gh command to merge branches and push back up to our main repo.

@CMCDragonkai
Member Author

CMCDragonkai commented May 22, 2022

Actually, automerge via gh is not possible; it only works via PRs. That would mean there would be a PR from staging to master.

So another way is just to use git to merge into master in the job and then do a git push of the master branch. This would be enough to indicate that master is up to date with staging.

There are also push options that enable auto-creating an MR on GitLab... or using gh to auto-create a PR on GH that merges staging into master. But this may end up creating duplicate PRs.

So keep it simple and just merge into master, and push it up. That will trigger the pipeline on master branch. Like a recursive pipeline.

What are the permissions needed to write back to the repo?

@CMCDragonkai
Member Author

For the push back to origin, we may need to authenticate either via SSH or HTTP. I believe HTTP would be best. Some sort of token needs to be available for jobs to authenticate themselves to the same project.

@CMCDragonkai
Member Author

I found https://docs.gitlab.com/ee/ci/jobs/ci_job_token.html but it's a bit more complicated than that. Firstly I'm not sure if it can even push up to the origin.

Alternatively, access tokens exist on each project, but they have to be set up individually for each project.

Finally, all of this would only affect the GitLab repository, which is a pull mirror of GitHub.

So the actual auth needed is GitHub's, which is why we wanted to use gh.

@CMCDragonkai
Member Author

CMCDragonkai commented May 22, 2022

I think we might need to use gh to just create the PR if it doesn't already exist, and then proceed to merge that PR if it does exist. If that can be done in one command, that would be great.

We now have a nice way of merging staging to master.

Note that further jobs run after merging to master. In this case production deployment, production deployment tests, then final production release.

@CMCDragonkai
Member Author

Some final issues:

  1. Pre-release/release should only run on tag pipeline, but not the commit associated with the tag to avoid duplicating the work.
  2. Deployment jobs should run on every commit regardless of the tags.
  3. Merging from staging to master should run on every commit regardless of the tags.

@CMCDragonkai
Member Author

To test merging from feature-native to staging, I need to use git merge feature-native --no-ff. By default git will do fast-forward merges.

Generally speaking, we would want feature branches to fast-forward into the staging branch, and this is enforced by GitHub.

According to https://stackoverflow.com/questions/60597400/how-to-do-a-fast-forward-merge-on-github, GitHub has no way of doing fast forward merges on PRs.

This means:

  1. PRs from GitHub always result in merge commits
  2. Using git merge ... will not (by default) result in a merge commit; beware of this in case you want one to exist (merge commits make it easier to revert merges)
  3. Because we aren't able to merge directly in the CI/CD without a "key" or token that has write access to the repo, we have to use gh to create a PR that auto-merges
  4. This means merges from staging to master won't be linear, and will end up with a "merge commit" on master that doesn't exist on staging

The original plan was to have linear commits between master and staging, but this is also acceptable for now.
