This is a Community Fork of the nupic.core C++ repository, with Python bindings. This implements the theory as described in Numenta's BAMI.
- Actively developed C++ core library (Numenta's NuPIC repos are in maintenance mode only)
- Clean, lean, optimized, and modern codebase
- Stable and well tested code
- Open to new ideas and easier involvement from the HTM community (it's fun to contribute; we keep master stable, but we are open to experiments and larger revamps of the code if they prove useful)
- Interfaces to other programming languages, currently C++ and Python
- Implemented in C++11 through C++17
- Static and shared lib files for use with C++ applications.
- Interfaces to Python 3 and Python 2.7 (Only Python 3 under Windows)
- Cross Platform Support for Windows, Linux, OSX and ARM64
- Easy installation. Far fewer dependencies than nupic.core, all handled by CMake
- Significant speed optimizations
- Simplified codebase
- Removed CapnProto serialization. It was pervasive and complicated the code considerably. It was replaced with simple binary streaming serialization in the C++ library.
- Removed sparse matrix libraries, use the optimized Connections class instead
- New and Improved Algorithms
- Revamped all algorithms APIs, making it easier for developers & researchers to use our codebase
- Sparse Distributed Representation class, integration, and tools for working with them
- API-compatibility with Numenta's code. An objective is to stay close to the Nupic API Docs; this is a priority for the NetworkAPI. The algorithm APIs, on the other hand, have deviated from their original API (although their logic is the same as Numenta's). If you are porting your code to this codebase, please follow the API Differences and consult the API Changelog.
- The 'NetworkAPI', as originally defined by the NuPIC library, includes a set of built-in Regions. These are described in the NetworkAPI docs.
- REST interface for the NetworkAPI, with a REST server.
If you want to use htm.core from Python, the easiest method is to install from PyPI.
- Note: to install with `pip` you'll need Python 3.7+.
python -m pip install -i https://test.pypi.org/simple/ htm.core
Note: to run all examples with visualizations, install with the extra requirements:
pip install -i https://test.pypi.org/simple/ htm.core[examples]
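After installing, you can do a quick sanity check and get a first look at the SDR class. This is a minimal sketch only; the `htm.bindings.sdr` module path is the one used throughout this README, while the method names follow recent releases of the bindings and may differ in yours (use `help(SDR)` to confirm).

```python
# Minimal sanity check of the htm.core install, using the SDR class.
from htm.bindings.sdr import SDR

a = SDR(2000)          # an SDR with 2000 bits
a.randomize(0.02)      # set roughly 2% of the bits to 1

b = SDR(a.dimensions)  # a second SDR with the same dimensions
b.randomize(0.02)

print("sparsity of a:", a.getSparsity())
print("overlap of a and b:", a.getOverlap(b))
```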
If you intend to use htm.core as a library that provides you with both Python and C++ HTM, you can use our binary releases.
For running the C++ apps/examples/tests from a binary release there are no prerequisites. If you want to use Python, you will also need one of the following:
- Standard Python 3.7+ (Recommended)
- Standard Python 2.7
- We recommend the latest version of 2.7 where possible, but the system version should be fine.
- Python 2 is not supported on Windows; use Python 3 instead.
- Python 2 is no longer tested by our CI. It may still work, but we don't test it, and we expect to drop support for Python 2 around 2020.
- Anaconda Python 3.7+
- On Windows you must run from within 'Anaconda Prompt', not 'Command Prompt'.
- The pre-built binary releases only work with Standard Python so you must build from sources.
- Anaconda Python is not tested in our CI.
Be sure that your Python executable is in the PATH environment variable. The Python found first on your PATH determines which version of Python the extension library will be built for (see the quick check below).
- Other implementations of Python may not work.
- Only the standard Python from python.org has been tested.
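A quick way to confirm which interpreter the build will pick up is to run the following in the `python` that is on your PATH:

```python
# Shows which Python interpreter is first on the PATH, and its version.
import sys
print(sys.executable)
print(sys.version)
```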
An advantage of htm.core is its well-tested, self-contained dependency install, so you can install HTM on almost any platform/system when building from source.
Fork or download the HTM-Community htm.core repository from https://github.com/htm-community/htm.core.
To clone the repo with git:
git clone https://github.com/htm-community/htm.core
- Same as for the binary releases, plus:
- C++ compiler: C++11/17 compatible (e.g. g++, clang++).
- Boost library (only if your compiler does not support the C++17 filesystem module).
Note: Windows MSVC 2019 defaults to C++17. On Linux, use the -std=c++17 compile option to avoid needing Boost.
- At a command prompt, `cd` to the root directory of this repository.
- Run: `python setup.py install --user --force`
This will build and install a release version of htm.core. The `--user` option prevents the system-installed site-packages folder from being changed and avoids the need for admin privileges. The `--force` option forces the package to be replaced if it already exists from a previous build. Alternatively, you can type `pip uninstall htm.core` to remove a previous package before performing a build.
- If you are using `virtualenv` you do not need the `--user` or `--force` options.
- If you are using Anaconda Python you must run within the Anaconda Prompt on Windows. Do not use the `--user` or `--force` options.
- If you run into problems due to caching of arguments in CMake, delete the folder `<path-to-repo>/build` and try again. This may only be an issue when developing C++ code.
After that completes you are ready to import the library:
python.exe
>>> import htm            # Python Library
>>> import htm.bindings   # C++ Extensions
>>> help( htm )           # Documentation
You can run the unit tests with
python setup.py test
After cloning/downloading the repository, do the following:
cd path-to-repository
mkdir -p build/scripts
cd build/scripts
cmake ../..
make -j8 install
Build Artifact | File Location |
---|---|
Static Library | build/Release/lib/libhtm-core.a |
Shared Library | build/Release/lib/libhtm-core.so |
Header Files | build/Release/include/ |
Unit Tests | build/Release/bin/unit_tests |
Hotgym Dataset Example | build/Release/bin/benchmark_hotgym |
MNIST Dataset Example | build/Release/bin/mnist_sp |
REST Server Example | build/Release/bin/rest_server |
REST Client Example | build/Release/bin/rest_client |
- A debug library can be created by adding `-DCMAKE_BUILD_TYPE=Debug` to the cmake command above. The debug library will be put in `build/Debug` rather than `build/Release`. Use the cmake option `-DCMAKE_INSTALL_PREFIX=../Release` to change this.
- The -j option can be used with the `make install` command to compile with multiple threads.
- This will not build the Python interface. Use the Python build described above to build and install the Python interface.
Here is an example of a Release build of your own C++ app that links to htm.core as a shared library.
#! /bin/sh
# Using GCC on linux ...
# First build htm.core from sources.
# cd <path-to-repo>
# mkdir -p build/scripts
# cd build/scripts
# cmake ../..
# make -j4 install
#
# Now build myapp
# We use -std=c++17 to get <filesystem> so we can avoid using the boost library.
# The -I gives the path to the includes needed to use with the htm.core library.
# The -L gives the path to the shared htm.core library location at build time.
# The LD_LIBRARY_PATH environment variable points to the htm.core library location at runtime.
g++ -o myapp -std=c++17 -I <path-to-repo>/build/Release/include myapp.cpp -L <path-to-repo>/build/Release/lib -lhtm_core -lpthread -ldl
# Run myapp
export LD_LIBRARY_PATH=<path-to-repo>/build/Release/lib:$LD_LIBRARY_PATH
./myapp
Here is an example of a Debug build of your own C++ app that links to htm.core as a shared library.
#! /bin/sh
# Using GCC on linux ...
# First build htm.core as debug from sources.
# cd <path-to-repo>
# mkdir -p build/scripts
# cd build/scripts
# cmake ../.. -DCMAKE_BUILD_TYPE=Debug
# make -j4 install
#
# Now build myapp
# The -g -Og options tell the compiler to generate debug information and use debug-friendly optimization.
# We use -std=c++17 to get <filesystem> so we can avoid using the boost library.
# The -D_GLIBCXX_DEBUG setting tells the compiler to build the std:: library with debug checks.
# The -I gives the path to the includes needed to use with the htm.core library.
# The -L gives the path to the shared htm.core library location at build time.
# The LD_LIBRARY_PATH environment variable points to the htm.core library location at runtime.
g++ -g -Og -o myapp -std=c++17 -D_GLIBCXX_DEBUG -I <path-to-repo>/build/Debug/include myapp.cpp -L <path-to-repo>/build/Debug/lib -lhtm_core -lpthread -ldl
# Run myapp in the debugger
export LD_LIBRARY_PATH=<path-to-repo>/build/Debug/lib:$LD_LIBRARY_PATH
gdb ./myapp
Our Dockerfile allows easy (cross) compilation from/to many HW platforms. It performs the full build, test, and package steps, and takes quite a while to complete.
If you are on amd64 (x86_64) and would like to build a Docker image:
docker build --build-arg arch=amd64 .
If you are on ARM64 and would like to build a Docker image, run the command below. The automated ARM64 CI build (detailed below) uses this specifically.
docker build --build-arg arch=arm64 .
Note:
- If you're directly on ARM64/aarch64 (running on real HW) you don't need the docker image, and can use the standard binary/source installation procedure.
The tricky part is cross-compilation: building for a different platform (aarch64) than the one your system runs on (x86_64). A typical case is CI, where all the standard (free) services offer only x86_64 systems, but we want to build for ARM.
See our ARM release workflow.
When running locally run:
docker run --privileged --rm multiarch/qemu-user-static:register
docker build -t htm-arm64-docker --build-arg arch=arm64 -f Dockerfile-pypi .
docker run htm-arm64-docker uname -a
docker run htm-arm64-docker python setup.py test
Note:
- The 1st line allows you to emulate another platform on your HW.
- The 2nd line builds the Docker image. The main Dockerfile is a lightweight Alpine ARM64 image that does the full build, test, and package step, which can take quite a long time. Dockerfile-pypi "just" switches you to an ARM64/aarch64 environment, after which you can build and test yourself.
We use Github Actions to build and run multiplatform (OSX, Windows, Linux, ARM64) tests and releases.
- pr.yml runs on each pull request (PR); it builds for Linux (Ubuntu 20.04), Windows (2019), and OSX (10.15) and checks that all tests pass. This is mandatory for a new PR to be accepted.
- release.yml is run manually by the maintainers as part of the release process; it creates binary GitHub releases, builds PyPI wheels for htm.core, and uploads the artifacts.
- arm.yml is an ARM64 build (which takes a long time) and is therefore run only daily.
This uses Docker and QEMU to achieve an ARM64 build on Actions' x86_64/amd64 hardware.
For Doxygen see docs README. For NetworkAPI see NetworkAPI docs.
Generate the IDE solution & build:
- Choose the IDE that interests you (remember that IDE choice is limited by your OS).
- Open the CMake executable in the IDE.
- Specify the source folder (`$HTM_CORE`), which is the location of the root CMakeLists.txt.
- Specify the build system folder (`$HTM_CORE/build/scripts`), i.e. where the IDE solution will be created.
- Click `Generate`.
After downloading the repository, do the following:
- NOTE: Visual Studio 2019 requires CMake version 3.14 or higher.
- CD to the top of the repository.
- Double click on startupMSVC.bat
  - This will set up the build, create the solution file (build/scripts/htm.cpp.sln), and start MS Visual Studio.
- Select `Release` or `Debug` as the Solution Configuration. The Solution Platform must remain at x64.
- Build everything. This will build the C++ library.
- In the Solution Explorer window, right click on 'unit_tests' and select `Set as StartUp Project` so the debugger will run the unit tests.
- If you also want the Python extension library, delete the `build` folder, then in a command prompt cd to the root of the repository and run `python setup.py install --user --force`.
Visual Studio Code can be used on any of our three platforms (Windows, Linux, OSx). You will need the C/C++ Tools extension by Microsoft and CMake Tools by vector-of-bool.
Startup Visual Studio Code and open the folder containing your htm.core repository which will set the workspace. Let it scan for a kit. Clear all of the notifications (lower right) so it can let you do the scan.
Then set your project level settings by initializing /.vscode/settings.json to the following as a starting point.
For Windows 10:
.vscode\settings.json
{
"cmake.buildDirectory": "${workspaceRoot}/build/scripts",
"cmake.generator": "Visual Studio 16 2019",
"cmake.platform": "x64",
}
To use Visual Studio 2017 as the tool chain, change generator to "Visual Studio 15 2017" and set the platform to "win32". Note that the ninja generator, the default, did not work very well on Windows.
For Ubuntu and OSx:
.vscode/settings.json
{
"cmake.buildDirectory": "${workspaceRoot}/build/scripts",
"cmake.generator": "gcc",
}
- File - New C/C++ Project - Empty or Existing CMake Project
- Location: (`$HTM_CORE`) - Finish
- Project properties - C/C++ Build - set the build command to "make -C build/scripts VERBOSE=1 install -j [number of your CPU cores]"
- There can be an issue with the indexer and the Boost library, which can cause OS memory to overflow. Add an exclude filter under Project properties - Resource Filters: exclude all folders that match "boost", recursively.
- (Tested with Eclipse IDE for C/C++ Developers, 2019-03 on Ubuntu 18.04)
For all new work, tab settings are 2 characters, with tabs replaced by spaces. The clang-format style is LLVM.
Creating a debug build of the htm.core library and unit tests is the same as building any C++ application in Debug mode in any IDE, as long as you do not include the Python bindings, i.e. do not include `-DBINDING_BUILD=Python3` in the CMake command.
(on Linux)
rm -r build
mkdir -p build/scripts
cd build/scripts
cmake -DCMAKE_BUILD_TYPE=Debug ../..
However, if you need to debug the python bindings using an IDE debugger it becomes a little more difficult.
The problem is that it requires a debug version of the Python library, python37_d.lib. It is possible to obtain one and link with it, but a better way to isolate the Python extension is to build a special main() as explained in debugging Python.
Be aware that CMake maintains a cache of build-time arguments and will ignore some arguments passed to it if they are already in the cache. So, between runs you need to clear the cache or, even better, entirely remove the `build/` folder (i.e. `git clean -xdf`).
When you run `python setup.py install --user --force`, it copies the Python scripts into `build/Release/distr/src` and deploys the package into the user site-packages (on Linux, under `~/.local/`).
To avoid deploying there, use "development mode":
python setup.py develop --user --force
This will create a link file in site-packages pointing to the distr folder. You can modify the distr scripts and your changes will be reflected immediately.
Note: unfortunately, calling this command again will not overwrite the distr scripts, so you need to delete the distr folder first.
To remove the link file call:
python setup.py develop --user --uninstall
Note: you can always check where the sources are being imported from by typing into a Python console, e.g.:
import htm.bindings.sdr
print(htm.bindings.sdr.__file__)
Note 2: it should be obvious, but do not use the `--user` option when using Python environment managers (Anaconda, etc.).
The installation scripts will automatically download and build the dependencies they need.
- Boost (Not needed by C++17 compilers that support the filesystem module)
- LibYaml or Yaml-cpp
- Eigen
- PyBind11
- gtest
- cereal
- mnist test data
- sqlite3
- digestpp (for SimHash encoders)
- and python requirements.txt
Once these third-party components have been downloaded and built, they will not be revisited on subsequent builds. To refresh or rebuild them, delete the folder `build/ThirdParty` and then re-build.
If you are installing on an air-gapped computer (one without Internet access), you can manually download the dependencies. On another computer, download the distribution packages as listed and rename them as indicated. Copy these to `${REPOSITORY_DIR}/build/ThirdParty/share` on the target machine.
Name to give it | Where to obtain it |
---|---|
libyaml.zip (*note1) | https://github.com/yaml/libyaml/archive/acd6f6f014c25e46363e718381e0b35205df2d83.zip |
boost.tar.gz (*note3) | https://dl.bintray.com/boostorg/release/1.72.0/source/boost_1_72_0.tar.gz |
googletest.tar.gz | https://github.com/google/googletest/archive/release-1.10.0.tar.gz |
eigen.tar.bz2 | https://gitlab.com/libeigen/eigen/-/archive/3.3.9/eigen-3.3.9.tar.bz2 |
mnist.zip (*note4) | https://github.com/wichtounet/mnist/archive/3b65c35ede53b687376c4302eeb44fdf76e0129b.zip |
pybind11.tar.gz | https://github.com/pybind/pybind11/archive/v2.6.2.tar.gz |
cereal.tar.gz | https://github.com/USCiLab/cereal/archive/v1.3.0.tar.gz |
sqlite3.tar.gz | https://www.sqlite.org/2021/sqlite-autoconf-3340100.tar.gz |
digestpp.zip | https://github.com/kerukuro/digestpp/archive/34ff2eeae397ed744d972d86b5a20f603b029fbd.zip |
cpp-httplib.zip(*note4) | https://github.com/yhirose/cpp-httplib/archive/v0.8.4.zip |
- note1: Version 0.2.2 of libyaml is broken, so use master from the repository.
- note3: Boost is not required for any compiler that supports C++17 with std::filesystem (MSVC2017, gcc-8, clang-9).
- note4: Used for examples. Not required to run, but the build expects it.
We support test-driven development with reproducible builds. You should run tests locally, and tests are also run as a part of the CI.
There are two (somewhat overlapping) sets of tests, for C++ and Python.
- C++ unit tests -- to run: `./build/Release/bin/unit_tests`
- Python unit tests -- to run: `python setup.py test` (also runs the C++ tests above). The tests live in:
  - py/tests/
  - bindings/py/tests/
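If you prefer to run only the Python suites, one minimal option is pytest, assuming pytest is installed in your environment (it is not required; `python setup.py test` already covers everything). For example, from the repository root:

```python
# Run only the Python test suites with pytest.
import pytest
pytest.main(["py/tests/", "bindings/py/tests/"])
```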
There are a number of Python examples, which are runnable from the command line. They are located in the module htm.examples.
Example command line invocation: $ python -m htm.examples.sp.hello_sp
Look in:
- py/htm/examples/
- py/htm/advanced/examples/
This is a simple example application that calls the SpatialPooler and TemporalMemory algorithms directly. It attempts to predict the electrical power consumption for a gymnasium over the course of several months.
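For a sense of what the example does internally, here is a minimal sketch of feeding encoded scalar values through the SpatialPooler and TemporalMemory. It is not the full hotgym example: the parameter values are purely illustrative, and the module paths and keyword names are assumptions based on recent releases of the bindings; the shipped htm.examples.hotgym script is the authoritative reference.

```python
# Minimal sketch (not the full hotgym example): encode a scalar, run the
# Spatial Pooler and Temporal Memory, and print the anomaly score.
from htm.bindings.sdr import SDR
from htm.bindings.algorithms import SpatialPooler, TemporalMemory
from htm.encoders.rdse import RDSE, RDSE_Parameters

params = RDSE_Parameters()
params.size = 1000          # width of the encoding
params.sparsity = 0.02      # fraction of active bits
params.resolution = 0.1     # scalar resolution
encoder = RDSE(params)

sp = SpatialPooler(inputDimensions=[params.size],
                   columnDimensions=[1024],
                   potentialRadius=params.size,
                   globalInhibition=True,
                   localAreaDensity=0.02)
tm = TemporalMemory(columnDimensions=[1024], cellsPerColumn=8)

for value in (4.5, 5.0, 5.5, 5.0, 4.5):
    encoding = encoder.encode(value)                # SDR encoding of the scalar
    active_columns = SDR(sp.getColumnDimensions())
    sp.compute(encoding, True, active_columns)      # learn = True
    tm.compute(active_columns, True)                # learn = True
    print(value, tm.anomaly)                        # anomaly score in [0, 1]
```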
To run python version:
python -m htm.examples.hotgym
To run C++ version: (assuming current directory is root of the repository)
./build/Release/bin/benchmark_hotgym
There is also a dynamically linked version of Hot Gym (not available on MSVC). You will need to specify the location of the shared library with LD_LIBRARY_PATH.
To run: (assuming current directory is root of the repository)
LD_LIBRARY_PATH=build/Release/lib ./build/Release/bin/dynamic_hotgym
The task is to recognize images of hand-written digits 0-9. This is often used as a benchmark; the example should score at least 95%.
To run: (assuming current directory is top of repository)
./build/Release/bin/mnist_sp
In Python:
python py/htm/examples/mnist.py
The REST interface for the NetworkAPI provides a way to access the underlying htm.core library using a REST client. The examples include a full REST web server that can process web requests which allow the user to create a Network object resource and perform HTM operations on it. Message layout details can be found in the NetworkAPI REST docs. To run:
./build/Release/bin/server [port [network_interface]]
A REST client, implemented in C++, is also provided as an example of how to use the REST web server. To run it, first start the server, then:
./build/Release/bin/client [host [port]]
The default host is 127.0.0.1 (the local host) and the port is 8050.
The htm.core library is distributed under the GNU Affero General Public License version 3 (AGPLv3). The full text of the license can be found at http://www.gnu.org/licenses.
Libraries that are incorporated into htm.core have the following licenses:
Library | Source Location | License |
---|---|---|
libyaml | https://github.com/yaml/libyaml | https://github.com/yaml/libyaml/blob/master/LICENSE |
boost (*note3) | https://www.boost.org/ | https://www.boost.org/LICENSE_1_0.txt |
eigen | http://eigen.tuxfamily.org/ | https://www.mozilla.org/en-US/MPL/2.0/ |
pybind11 | https://github.com/pybind/pybind11 | https://github.com/pybind/pybind11/blob/master/LICENSE |
cereal | https://uscilab.github.io/cereal/ | https://opensource.org/licenses/BSD-3-Clause |
digestpp | https://github.com/kerukuro/digestpp | released into public domain |
cpp-httplib | https://github.com/yhirose/cpp-httplib | https://github.com/yhirose/cpp-httplib/blob/master/LICENSE |
- note3: Boost is not used if built with any compiler that supports C++17 with std::filesystem (MSVC2017, gcc-8, clang-9).
We're happy that you can use the community work in this repository or even join the development! Please give us attribution by linking to us as htm.core at https://github.com/htm-community/htm.core/ , and for papers we suggest using the following BibTeX citation:
@misc{htmcore2019,
abstract = "Implementation of cortical algorithms based on HTM theory in C++ \& Python. Research \& development library.",
author = "M. Otahal and D. Keeney and D. McDougall and others",
commit = {bf6a2b2b0e04a1d439bb0492ea115b6bc254ce18},
howpublished = "\url{https://github.com/htm-community/htm.core/}",
journal = "Github repository",
keywords = "HTM; Hierarchical Temporal Memory; NuPIC; Numenta; cortical algorithm; sparse distributed representation; anomaly; prediction; bioinspired; neuromorphic",
publisher = "Github",
series = "{Community edition}",
title = "{HTM.core implementation of Hierarchical Temporal Memory}",
year = "2019"
}
Note: you can update the commit to reflect the latest version you have been working with, to help make the research reproducible.
Numenta's BAMI is the formal theory behind it all. Also consider Numenta's Papers.
HTM School is a set of videos that explains the concepts.
Indy's Blog
Questions regarding the theory can be posted to the HTM Forum.
Questions and bug reports regarding the library code can be posted in the htm.core Issues tracker.
Community projects for working with HTM.
This project aspires to create a tool that helps visualize HTM systems in 3D, using the open-source 3D rendering framework https://www.panda3d.org/
The NetworkAPI has a region called "DatabaseRegion". This region can be used to generate an SQLite file which is later read by PandaVis (the DashVis feature) to show interactive plots in a web browser on localhost. See napi_hello_database for basic usage.
For more info, visit the repository of the project.