# Design

A developer overview of the code design for the impervious daemon. The sections below describe the submodules that exist in this codebase.

## cmd

The entry point for running the impervious application.

### configure.go

Initializes nearly all of the submodules that exist throughout this codebase. It is shared by the impd daemon and the impcli command line interface.

### impcli

The entry point for the impervious CLI. This process runs one-off commands and exits, rather than staying resident like the daemon.

### impd

The entry point for the impervious daemon. This runs in the background with HTTP/gRPC servers for commanding the daemon, and listening ports for DIDComm messaging.

## comm

The communication/transport layer for sending messages over DIDComm. Message types, the HTTP transport, and the encryption/signing of each message live here.

## config

All configuration logic lives here.

### config.go

The Go structs that the YAML configuration is parsed into. These should match the config.yml file.
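The struct-mirrors-YAML pattern described above can be sketched as follows. The section and field names here are hypothetical, not the daemon's actual schema, and a struct literal stands in for the YAML parsing step so the example stays self-contained.

```go
package main

import "fmt"

// Config mirrors the top-level layout of a hypothetical config.yml.
// The yaml struct tags name the keys a YAML parser would bind to.
type Config struct {
	Server    ServerConfig    `yaml:"server"`
	Lightning LightningConfig `yaml:"lightning"`
}

type ServerConfig struct {
	GRPCPort int `yaml:"grpc_port"`
	HTTPPort int `yaml:"http_port"`
}

type LightningConfig struct {
	Nodes []string `yaml:"nodes"`
}

func main() {
	// In the daemon this struct would be populated by a YAML parser
	// (e.g. yaml.Unmarshal); a literal is used here for illustration.
	cfg := Config{
		Server:    ServerConfig{GRPCPort: 8881, HTTPPort: 8882},
		Lightning: LightningConfig{Nodes: []string{"localhost:10009"}},
	}
	fmt.Println(cfg.Server.GRPCPort)
}
```

Keeping the struct and config.yml in sync means every key added to the YAML example gets a matching tagged field here.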

### config.yml

The YAML equivalent of the config structs. This example should be kept up to date. You may copy it to local.config.yml, or any other name, to use it for your purposes locally.

## contacts

Manages the user's locally saved contacts. Mostly used for mapping a contact to a DID, with associated metadata for the user.
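A minimal sketch of that mapping, assuming a simple in-memory store; the `Contact` fields, method names, and example DID are illustrative, not the submodule's actual API.

```go
package main

import "fmt"

// Contact pairs a human-friendly name with a DID and free-form metadata.
type Contact struct {
	Name     string
	DID      string
	Metadata map[string]string
}

// ContactStore keeps locally saved contacts keyed by name.
type ContactStore struct {
	byName map[string]Contact
}

func NewContactStore() *ContactStore {
	return &ContactStore{byName: make(map[string]Contact)}
}

func (s *ContactStore) Add(c Contact) { s.byName[c.Name] = c }

// Resolve returns the DID saved for a contact name, if any.
func (s *ContactStore) Resolve(name string) (string, bool) {
	c, ok := s.byName[name]
	return c.DID, ok
}

func main() {
	store := NewContactStore()
	store.Add(Contact{
		Name:     "alice",
		DID:      "did:peer:example",
		Metadata: map[string]string{"note": "test contact"},
	})
	did, _ := store.Resolve("alice")
	fmt.Println(did)
}
```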

## core

This is where many components are merged into a single submodule that serves as the entry point for commands. Both CLI and HTTP/gRPC commands can use the core submodule. For instance, if the contacts APIs should be exposed to the user, core first imports the contacts submodule, then exposes a public method wrapping each contacts API.

First add the method to the Core interface, then implement it in the core structure.
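The two-step pattern above can be sketched as follows. The names (`ListContacts`, `contactsStore`) are hypothetical stand-ins for the real contacts submodule and method, but the shape matches the description: add the method to the `Core` interface, then implement it on the core structure, which delegates to the imported submodule.

```go
package main

import "fmt"

type Contact struct {
	Name string
	DID  string
}

// contactsStore stands in for the contacts submodule that core imports.
type contactsStore struct {
	contacts []Contact
}

func (c *contactsStore) List() []Contact { return c.contacts }

// Core is the shared entry point for CLI and HTTP/gRPC commands.
// Step 1: add the method to the interface.
type Core interface {
	ListContacts() []Contact
}

type core struct {
	contacts *contactsStore
}

// Step 2: implement it on the core structure, delegating to the submodule.
func (c *core) ListContacts() []Contact { return c.contacts.List() }

func main() {
	var c Core = &core{contacts: &contactsStore{
		contacts: []Contact{{Name: "alice", DID: "did:peer:example"}},
	}}
	fmt.Println(len(c.ListContacts()))
}
```

Routing everything through one interface keeps the CLI and the gRPC server from depending on submodules directly.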

## gen

Auto-generated code is output here; it is mostly used for gRPC libraries in other languages. Running `make proto` creates this.

## id

Where most of the DID generation logic lives, such as resolution and updates, from Peer-based DIDs to ION. DID state is managed here, along with any DID updates that need to occur through an ION client.

### ion

The client library for interfacing with a real ION node.

## ipfs

IPFS-based functionality goes here, such as adding or resolving an IPFS file.

## key

Cryptography and key-based actions go here, from managing an encrypted master seed to encryption and signing.

## lightning

All of the Lightning-based interactions we need. A Lightning node manager handles using one or many nodes for resiliency.

### node

Where the actual Lightning node client logic resides. The Lightning node manager passes functionality through to a node instance in this submodule.

## messages

The high-level API logic for sending messages to another DID.

## proto

Where the gRPC APIs are defined. From our proto files, auto-generated artifacts such as server implementations and client libraries are created. Our services should be placed in the imp proto submodule.

## server

Where the gRPC API server is implemented. It typically passes requests through to core.

## service

Service handles the business logic for sending and receiving messages. Specific services can be created here, keyed off of the message type field so that incoming messages are automatically routed to them.
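The type-keyed routing described above can be sketched as a registry mapping a message's type field to a handler. The `Message`/`Handler` shapes and the DIDComm type URI below are illustrative assumptions, not the daemon's actual signatures.

```go
package main

import "fmt"

// Message carries a type field used for dispatch, plus a body.
type Message struct {
	Type string
	Body string
}

// Handler is the hypothetical signature a service handler exposes.
type Handler func(Message) string

// Router maps each message type to the service registered for it.
type Router struct {
	handlers map[string]Handler
}

func NewRouter() *Router {
	return &Router{handlers: make(map[string]Handler)}
}

func (r *Router) Register(msgType string, h Handler) {
	r.handlers[msgType] = h
}

// Route dispatches a message to the handler keyed by its type field.
func (r *Router) Route(m Message) (string, bool) {
	h, ok := r.handlers[m.Type]
	if !ok {
		return "", false
	}
	return h(m), true
}

func main() {
	r := NewRouter()
	// Example DIDComm-style type URI; the real services define their own.
	r.Register("https://didcomm.org/basicmessage/2.0/message", func(m Message) string {
		return "handled: " + m.Body
	})
	out, _ := r.Route(Message{Type: "https://didcomm.org/basicmessage/2.0/message", Body: "hi"})
	fmt.Println(out)
}
```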

### message

When a service wants to send or reply to a message, it goes through the message submodule.

### relay

A relay service that can optionally run on the daemon. A daemon that wishes to be a relay can run in relay mode by designating that in the configs. Relay registration requests are then routed to the relay service automatically.

Users who wish to delegate to a relay can also run this service as a relay-registration service. This covers the client side of the interactions between users and relays, both sending and responding. For instance, requesting stored messages from the relay happens here for users.

### signing

Signs and verifies a message with the Lightning node. TODO: this should be moved, as it no longer reflects a service by our terminology.

### handler.go

Where the decryption, verification, and routing of messages takes place. This handler also auto-routes all incoming messages to a websocket if the user is listening for them.

### messenger.go

The entry point for sending service messages.

### service.go

Defines the structure that a service implementation should follow.

### websocket.go

If a user wishes to actively listen to the messages coming into the daemon through DIDComm, they may subscribe to the websocket and receive all messages.

## state

The data/state logic for our daemon. The SQL implementation may run either remotely against a MySQL instance or locally against an encrypted SQLite file.
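One way to support both backends is to have callers depend on a small storage interface and choose the concrete SQL implementation at startup. The sketch below illustrates that shape with a hypothetical `Store` interface and an in-memory implementation standing in for the MySQL/SQLite drivers; none of these names come from the actual codebase.

```go
package main

import "fmt"

// Store abstracts over the SQL backend (remote MySQL or an encrypted
// local SQLite file); callers never see which one is in use.
type Store interface {
	Put(key, value string) error
	Get(key string) (string, error)
}

// memStore is a toy in-memory implementation used for illustration.
type memStore struct {
	data map[string]string
}

func newMemStore() *memStore {
	return &memStore{data: make(map[string]string)}
}

func (m *memStore) Put(key, value string) error {
	m.data[key] = value
	return nil
}

func (m *memStore) Get(key string) (string, error) {
	v, ok := m.data[key]
	if !ok {
		return "", fmt.Errorf("not found: %s", key)
	}
	return v, nil
}

func main() {
	// At startup the daemon would pick a backend from the config;
	// the rest of the code only sees the Store interface.
	var s Store = newMemStore()
	s.Put("did", "did:peer:example")
	v, _ := s.Get("did")
	fmt.Println(v)
}
```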