Home
Welcome to the OpenNARS-for-Applications wiki! First, a few references:
July 26, 2022: v0.9.x overview and functional diagram
Event providers: sensory channels and motivation module.
March 22, 2022, Intelligence through reasoning (postdoc project): https://www.youtube.com/watch?v=wLftx_srsMI
December 23, 2021, ONA Colab notebook: https://colab.research.google.com/drive/1YSfquCubY6_YnMAJLu8EJMJNZrkp5CIA?usp=sharing
October 17, 2021, ONA presentation at AGI-21 conference's NARS workshop: https://youtu.be/zrC9VA6TKBs?t=22311
July 16, 2021, PhD thesis: https://github.com/opennars/OpenNARS-for-Applications/files/6832325/Dissertation_PH_Submitted.pdf
July 21, 2021, presentation thereof: https://www.youtube.com/watch?v=B9SKu7u6G-I
A less recent comprehensive presentation: https://www.youtube.com/watch?v=RepnwEh_7hU
Publication: https://github.com/opennars/OpenNARS-for-Applications/releases/download/v0.8.3/ONA_Architecture_and_Control.pdf
Also please see the video tutorials: https://github.com/opennars/OpenNARS-for-Applications/wiki/Video-tutorials
The following diagram shows the high-level architecture of the reasoner, with the input sequencing and the two inference cycles, Sensorimotor and Semantic:
The dependencies between the data types are shown in the following diagram:
Sensory Channels:
The reasoner allows for sensory input from multiple modalities. Each sensory channel essentially converts sensory signals to Narsese. Depending on the nature of the modality, its internals may vary. As an example for application purposes, a Vision Channel could consist of a Multi-Class Multi-Object Tracker for the detection and tracking of instances and their type, and an encoder which converts the tracker output into Narsese: the instances detected in the current moment, their types, visual properties, and the spatial relationships among the instances.
Input: Sensor signals
Function: Periodically encodes the current signal values of a specific modality into Narsese; a fixed mapping such as an ANN can also be used here.
Output: Events with the Narsese encodings
Eviction: Nothing beyond the current moment is kept; it is essentially a form of value override.
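As a toy illustration of such an encoder, the following sketch turns hypothetical tracker output into Narsese input events. The function name and the exact encoding are assumptions for illustration, not ONA's actual channel code:

```python
def encode_detections(detections):
    """Encode tracker output for the current moment as Narsese input events.

    `detections` is a list of (instance_id, type_label) pairs (a hypothetical
    tracker output format). Only the current moment is encoded; earlier
    signal values are simply overridden.
    """
    events = []
    for instance, label in detections:
        # Each detected instance is asserted to be of its detected type,
        # stamped as an event happening now (":|:").
        events.append("<{" + instance + "} --> " + label + ">. :|:")
    return events
```

Visual properties and spatial relations would be encoded analogously, as further inheritance and relational statements about the same instances.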
Sequencer:
The Sequencer is responsible for multi-modal integration. It creates spatio-temporal patterns (compound events) from the events generated by the sensory channels. It achieves this by building both sequences and parallel conjunctions, depending on the temporal distance between the events.
Input: Events from the sensory channels
Function: Multi-modal integration, which in each moment combines one piece from each modality into one or a few composite events.
Output: Temporal sequences (which are also events) of the formed multi-modal composite events, from length 1 up to a small maximum.
Eviction: Only the newest k composite events are kept, following a simple first-in, first-out principle, hence the name.
Code: FIFO, Cycle
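The sequencing behavior can be sketched as follows. The window size, capacity, maximum length, and class name are illustrative assumptions, not ONA's actual parameters; the point is only that temporal distance decides between a parallel conjunction (`&|`) and a sequence (`&/`), and that only the newest k composites are kept:

```python
from collections import deque

PARALLEL_WINDOW = 10   # events closer in time than this count as simultaneous (illustrative)
MAX_LEN = 3            # small maximum compound length (illustrative)
FIFO_SIZE = 20         # k: only the newest composite events are kept (illustrative)

class Sequencer:
    def __init__(self):
        # deque(maxlen=...) evicts the oldest entry first: first in, first out.
        self.fifo = deque(maxlen=FIFO_SIZE)

    def put(self, term, time):
        """Add one event; return the composite events formed with recent ones."""
        composites = []
        for prev_term, prev_time in list(self.fifo)[-(MAX_LEN - 1):]:
            if time - prev_time < PARALLEL_WINDOW:
                # close in time: parallel conjunction
                composites.append(("(" + prev_term + " &| " + term + ")", time))
            else:
                # further apart: temporal sequence
                composites.append(("(" + prev_term + " &/ " + term + ")", time))
        self.fifo.append((term, time))
        return composites
```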
Cycling Events Queue:
This is the global attention buffer of the reasoner. It maintains a fixed capacity: items are ranked according to priority, and when a new item enters, the lowest-priority item is evicted. For selection, the highest-priority items are retrieved, for both semantic inference and sensorimotor inference; the retrieved items and the inference results then go back into the cycling events queue after having passed through concept memory. An item's priority decays on usage, but also decays while in the queue; both decay rates are system parameters.
Input: Events coming from FIFO Sequencer, Semantic Inference, and Sensorimotor Inference
Function: Attention control point, it can also be seen as the short-term memory.
Output: Samples the highest priority event
Eviction: Only the m highest-priority events are kept, where priority decays over time via a global durability value.
Code: PriorityQueue, Cycle
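A minimal sketch of such a bounded, decaying priority queue follows. The capacity and decay constants are illustrative assumptions, not ONA's actual parameter values:

```python
QUEUE_SIZE = 4        # m: fixed capacity (illustrative)
DECAY_ON_USE = 0.8    # priority multiplier applied when an item is selected (illustrative)
DECAY_PER_CYCLE = 0.9 # global durability applied to all queued items each cycle (illustrative)

class CyclingEventsQueue:
    def __init__(self):
        self.items = []  # [priority, event] pairs, kept sorted by descending priority

    def add(self, event, priority):
        self.items.append([priority, event])
        self.items.sort(key=lambda x: -x[0])
        del self.items[QUEUE_SIZE:]      # evict the lowest-priority overflow

    def select(self):
        best = self.items[0]             # retrieve the highest-priority event
        best[0] *= DECAY_ON_USE          # priority decays on usage
        self.items.sort(key=lambda x: -x[0])
        return best[1]

    def cycle(self):
        for item in self.items:
            item[0] *= DECAY_PER_CYCLE   # priority also decays while in the queue
```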
Sensorimotor Inference:
This component is responsible for temporal and procedural reasoning; it uses NAL 6-8 for:
- Formation and strengthening of implication links between concepts, driven both by input sequences and derived events.
- Prediction of new events based on input and derived events, via implication links.
- Efficient subgoaling via implication links.
- Execution of decisions when a subgoal exceeds the decision threshold.
Input: Events and implications from Concept Memory
Function: Implication formation and usage. For belief events, implications are built via Induction, and the event together with existing implications is used to make predictions via Deduction. For goal events, the operation associated with the implication candidate that derives the highest desire-value expectation above the decision threshold (precondition fulfilled enough, implication truth value high enough) gets executed. If no candidate exceeds the decision threshold, the preconditions of the implication candidates become derived goals (also via Deduction), which are passed on to the Cycling Events Queue.
Output: Derived subgoals, operation executions leading to feedback, and predicted events.
Code: Inference, Cycle
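The decision procedure can be sketched as follows, using the standard NAL deduction truth function and truth expectation. The candidate encoding, threshold value, and function names are illustrative assumptions, not ONA's actual interfaces:

```python
DECISION_THRESHOLD = 0.51  # a system parameter; this value is illustrative

def deduction(t1, t2):
    """NAL deduction truth function on (frequency, confidence) pairs."""
    (f1, c1), (f2, c2) = t1, t2
    f = f1 * f2
    return (f, c1 * c2 * f)

def expectation(truth):
    """NAL truth expectation."""
    f, c = truth
    return c * (f - 0.5) + 0.5

def decide(goal_truth, candidates):
    """Choose among implication candidates, encoded here as
    (operation, precondition, implication_truth, precondition_truth).

    Desire is derived by Deduction from the goal through the implication and
    its precondition evidence. Above threshold: execute the operation;
    otherwise: the precondition becomes a derived subgoal.
    """
    best = None
    for op, precond, impl_truth, precond_truth in candidates:
        desire = deduction(deduction(goal_truth, impl_truth), precond_truth)
        if best is None or expectation(desire) > expectation(best[2]):
            best = (op, precond, desire)
    if best is not None and expectation(best[2]) > DECISION_THRESHOLD:
        return ("execute", best[0])
    return ("subgoal", best[1] if best else None)
```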
Semantic Inference:
All declarative reasoning (NAL 1-6) happens here.
As Inheritance can be seen as a way to describe objects in a universe of discourse, the related inference helps the reasoner to categorize events and to refine these categorizations with further experience. Ultimately this allows the reasoner to learn and use arbitrary relations, to interpret situations in richer ways, and to find crucial commonalities and differences between all sorts of things. Also, due to the descriptive power of NAL and its experience-grounded semantics, semi-natural communication with the reasoner becomes possible, and high-level knowledge can be communicated directly. This is the case even when the meaning of some terms is not yet clear and needs to be enriched to become more useful.
Input: Events and beliefs from Concept Memory
Function: General declarative reasoning by applying NAL 1-6 inference rules to the received event and belief.
Output: Derived events of various kinds to be passed on to the Cycling Events Queue.
Code: NAL, RuleTable, Cycle
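A minimal sketch of one such rule, syllogistic deduction over Inheritance statements, with the standard NAL deduction truth function. The tuple encoding of statements is an illustrative assumption, not ONA's internal representation:

```python
def deduction_truth(t1, t2):
    """NAL deduction truth function on (frequency, confidence) pairs."""
    (f1, c1), (f2, c2) = t1, t2
    f = f1 * f2
    return (f, c1 * c2 * f)

def deduce(event, belief):
    """Syllogistic deduction on Inheritance statements, here encoded as
    (subject, predicate, truth) triples:
    <a --> b> together with <b --> c> yields <a --> c>.
    """
    s1, p1, t1 = event
    s2, p2, t2 = belief
    if p1 == s2:                       # shared middle term: event feeds belief
        return (s1, p2, deduction_truth(t1, t2))
    if p2 == s1:                       # or the other way around
        return (s2, p1, deduction_truth(t2, t1))
    return None                        # no common term, no deduction
```

For example, from the event `<robin --> bird>` and the belief `<bird --> animal>`, the reasoner derives `<robin --> animal>` with a weaker confidence than either premise.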
Concept Memory:
The concept store of the system. Like the cycling events queue, it maintains a fixed capacity, but instead of being ranked by priority, items are ranked according to usefulness, and when a new item enters, the least useful item is evicted. Usefulness takes both the usage count and the last usage time into account, to capture the long-term quality of an item while giving new items a chance as well.

All events from the cycling events queue, both input and derived, that weren't evicted from the queue arrive at this block. It creates a concept node for each Inheritance statement, or activates the existing node with the event's priority. It also performs revision of the knowledge in each statement's concept. Additionally, it holds the implications formed by the sensorimotor component, which manifest as implication links between concepts.

The activation of concepts allows the reasoner's inference to be contextual: only the beliefs of the highest-priority concepts which share a common term with the event selected from the event queue, or are temporally related to it through an implication link, will be retrieved for inference. The inference results (from either semantic or sensorimotor inference) are assigned a priority which takes the following into account:
- Belief concept priority (context)
- Truth of the conclusion (truth)
- Priority of the event which triggered the inference (Priority_child < Priority_parent)
- Complexity of the result
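A hypothetical way to combine these four factors (the actual ONA formula differs; this only illustrates that context, truth, parent priority, and syntactic complexity all shape the resulting priority, with the child always below the parent):

```python
def truth_expectation(truth):
    """NAL truth expectation of a (frequency, confidence) pair."""
    f, c = truth
    return c * (f - 0.5) + 0.5

def conclusion_priority(parent_priority, belief_concept_priority, truth, complexity):
    """Illustrative priority assignment for a derived event.

    All factors are in [0, 1] except complexity >= 1, so the result is
    always below parent_priority (Priority_child < Priority_parent).
    """
    return (parent_priority * belief_concept_priority
            * truth_expectation(truth) / complexity)
```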
Input: Events from cycling events queue
Function: Builds concepts from incoming events (can be seen as long-term memory), and matches each incoming event to at most a fixed number of semantically or temporally related highest-priority concepts
Output: The beliefs/implications in the matched concepts are passed on, together with the event, to Semantic Inference/Sensorimotor Inference.
Eviction: Only the n most useful concepts are kept, where usefulness is a function of use count and last used time.
Code: PriorityQueue, Memory, Cycle
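A sketch of usefulness-ranked eviction follows. The exact usefulness formula here is an assumption that merely mirrors the stated idea of combining use count with last-used time, so that well-used concepts survive while fresh ones still get a chance:

```python
def usefulness(use_count, last_used, current_time):
    """Illustrative usefulness measure: rewards frequently used concepts,
    discounted by how long ago they were last used."""
    recency = current_time - last_used
    return use_count / (1.0 + recency)

def evict(concepts, capacity, current_time):
    """Keep only the `capacity` most useful concepts.

    `concepts` maps concept name -> (use_count, last_used).
    """
    ranked = sorted(concepts.items(),
                    key=lambda kv: usefulness(kv[1][0], kv[1][1], current_time),
                    reverse=True)
    return dict(ranked[:capacity])
```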