A sample of hexagonal architecture handling news creation and news retrieval.
The tech stack for this sample is:
- Kafka as the message broker
- Elasticsearch for search
- Redis for result caching
- MySQL for data persistence
- go-chi as the HTTP router
You need Docker Compose installed on your local machine.
This project already includes a docker-compose file that sets up the following servers:
- Kafka
- ElasticSearch
- Redis
- MySQL
To run all of these servers locally, go to the project root directory and run the following command:
docker-compose up -d
If Docker Compose does not work for you, you can install each of these servers as a standalone application.
This application supports two kinds of databases, MySQL and MongoDB, to prove that our ports are completely agnostic of the implementation.
By default it connects to the MySQL database at 127.0.0.1:3306 using the database news.
To connect to a different database, set the connection information in environment variables, as in the following examples:
- MongoDB
set url=mongodb://localhost:27017/local
set timeout=10
set db=local
set driver=mongo
- MySQL
set url=root:root@tcp(127.0.0.1:3306)/news
set timeout=10
set db=news
set driver=mysql
- Redis, Elasticsearch & Kafka (used with either driver)
set redis_url=redis://:@localhost:6379/0
set redis_timeout=10
set elastic_url=http://localhost:9200
set elastic_timeout=10
set elastic_index=news
set kafka_url=localhost:9092
set kafka_timeout=10
set kafka_topic=news
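For reference, here is a minimal sketch of how these variables could be read in Go. The Config struct, field names, and helpers are illustrative, not the project's actual code; only the environment variable names and defaults come from this README:

```go
package config

import (
	"os"
	"strconv"
	"time"
)

// Config collects the database settings listed above; the Redis,
// Elasticsearch, and Kafka variables would be read the same way.
type Config struct {
	URL     string        // url
	Timeout time.Duration // timeout, in seconds
	DB      string        // db
	Driver  string        // driver: "mysql" or "mongo"
}

// getEnv returns the value of key, or def when it is unset.
func getEnv(key, def string) string {
	if v, ok := os.LookupEnv(key); ok {
		return v
	}
	return def
}

// Load reads the configuration from the environment, defaulting
// to the MySQL settings described above.
func Load() Config {
	seconds, err := strconv.Atoi(getEnv("timeout", "10"))
	if err != nil {
		seconds = 10
	}
	return Config{
		URL:     getEnv("url", "root:root@tcp(127.0.0.1:3306)/news"),
		Timeout: time.Duration(seconds) * time.Second,
		DB:      getEnv("db", "news"),
		Driver:  getEnv("driver", "mysql"),
	}
}
```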
After setting the database information, we only need to run the main.go file:
go run main.go
Here is our API list and its payloads:
- [GET] /news
/news?offset=0&limit=10
- [POST] /news
{
    "ID": 15,
    "Author": "Rest",
    "Body": "Hello this is news from REST",
    "Created": "2020-03-01T22:59:59.999Z"
}
- [PUT] /news/{news_id}
/news/15
{
    "ID": 15,
    "Author": "Rest",
    "Body": "Hello this is news from REST",
    "Created": "2020-03-01T22:59:59.999Z"
}
- [DELETE] /news/{news_id}
/news/15
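As a sketch, these routes could be registered with go-chi as shown below; the NewsHandler type and its method names are assumptions for illustration:

```go
package api

import (
	"net/http"

	"github.com/go-chi/chi/v5"
)

// NewsHandler is an illustrative handler type; the real project
// may organize its handlers differently.
type NewsHandler struct{}

func (h *NewsHandler) List(w http.ResponseWriter, r *http.Request)   {} // GET /news?offset=0&limit=10
func (h *NewsHandler) Create(w http.ResponseWriter, r *http.Request) {} // POST /news
func (h *NewsHandler) Update(w http.ResponseWriter, r *http.Request) {
	// chi exposes path parameters such as {news_id} via URLParam.
	_ = chi.URLParam(r, "news_id")
}
func (h *NewsHandler) Delete(w http.ResponseWriter, r *http.Request) {} // DELETE /news/{news_id}

// NewRouter wires up the four endpoints listed above.
func NewRouter(h *NewsHandler) http.Handler {
	r := chi.NewRouter()
	r.Get("/news", h.List)
	r.Post("/news", h.Create)
	r.Put("/news/{news_id}", h.Update)
	r.Delete("/news/{news_id}", h.Delete)
	return r
}
```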
We have our news service, which connects to a serializer that encodes the data as either JSON or MessagePack before serving it through the REST API.
On the other side we have our repository, which uses either MySQL or MongoDB depending on how we start the application from the command line.
So our API can accept both JSON and MessagePack, and our repository can use both MySQL and MongoDB, without any of this affecting our service.
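As a sketch, the serializer Port could look like the following; the method signatures are assumptions, but the Port/Adapter split is exactly what lets the handlers stay format-agnostic:

```go
package serializer

import "encoding/json"

// Serializer is the Port the API depends on; the json and msgpack
// packages each provide an Adapter that implements it.
type Serializer interface {
	Decode(input []byte, out any) error
	Encode(input any) ([]byte, error)
}

// JSON is the json Adapter.
type JSON struct{}

func (JSON) Decode(input []byte, out any) error { return json.Unmarshal(input, out) }
func (JSON) Encode(input any) ([]byte, error)   { return json.Marshal(input) }
```

The handler can then pick an adapter based on, for example, the request's Content-Type header, so the same endpoint serves both formats.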
Here is the table structure for the MySQL table:
- id INT
- author TEXT
- body TEXT
- created TIMESTAMP
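In Go, this table maps naturally onto a model struct; the struct tags below are illustrative:

```go
package models

import "time"

// News mirrors the MySQL table above.
type News struct {
	ID      int64     `json:"ID" db:"id"`
	Author  string    `json:"Author" db:"author"`
	Body    string    `json:"Body" db:"body"`
	Created time.Time `json:"Created" db:"created"`
}
```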
The application flow is as follows:
- Create news using the [POST] /news endpoint:
    - the payload is sent to a Kafka producer
    - a Kafka consumer picks the data up, stores the complete record in the MySQL database, and stores the ID and created fields in Elasticsearch (ES)
- Retrieve news using the [GET] /news endpoint (see the sketch after this list):
    - fetch the data from Redis and return it to the user
    - if the data in Redis has expired or does not exist, fetch it from Elasticsearch
    - the Elasticsearch query applies the offset and limit and orders results descending by creation date (the created field)
    - after querying Elasticsearch, fetch the full records from the database one by one using goroutine workers
    - after fetching from the database, store the result in Redis as cached data
- Update news using the [PUT] /news/{news_id} endpoint:
    - update the data in the persistent database (MySQL or MongoDB)
    - update the data in Elasticsearch
    - update the data in the cache database (Redis)
- Delete news using the [DELETE] /news/{news_id} endpoint:
    - delete the data from the persistent database (MySQL or MongoDB)
    - delete the data from Elasticsearch
    - delete the data from the cache database (Redis)
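Here is a minimal sketch of the retrieval path described above; the interface and type names are illustrative stand-ins for the project's ports:

```go
package logic

import (
	"context"
	"fmt"
	"sync"
)

// News is a stand-in for the project's news model.
type News struct{ ID int64 }

// Cache, Index, and Store stand in for the Redis, Elasticsearch,
// and database ports.
type Cache interface {
	Get(ctx context.Context, key string) ([]News, bool)
	Set(ctx context.Context, key string, items []News)
}

type Index interface {
	// IDs returns news IDs ordered descending by creation date.
	IDs(ctx context.Context, offset, limit int) ([]int64, error)
}

type Store interface {
	Find(ctx context.Context, id int64) (News, error)
}

// List implements the cache-aside read path: Redis first, then
// Elasticsearch for the ordered IDs, then the database fanned out
// across goroutine workers, and finally a cache refill.
func List(ctx context.Context, c Cache, idx Index, db Store, offset, limit int) ([]News, error) {
	key := fmt.Sprintf("news:%d:%d", offset, limit) // key scheme is an assumption
	if items, ok := c.Get(ctx, key); ok {
		return items, nil // cache hit
	}
	ids, err := idx.IDs(ctx, offset, limit)
	if err != nil {
		return nil, err
	}
	items := make([]News, len(ids))
	var wg sync.WaitGroup
	for i, id := range ids {
		wg.Add(1)
		go func(i int, id int64) {
			defer wg.Done()
			if n, err := db.Find(ctx, id); err == nil {
				items[i] = n // per-item errors are ignored in this sketch
			}
		}(i, id)
	}
	wg.Wait()
	c.Set(ctx, key, items) // warm the cache for the next request
	return items, nil
}
```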
By implementing Hexagonal Architecture we also apply Dependency Inversion and Dependency Injection. Here is an explanation of the project structure:
- api: contains the handlers for the API
- models: contains the data models
- repositories: contains the Port interfaces for the repository adapters
    - mysql: contains the MySQL Adapter that implements the NewsRepository interface. This package holds the MySQL client and connects to the MySQL database to handle queries and commands. The complete news data is stored here.
    - mongodb: contains the MongoDB Adapter that implements the NewsRepository interface. This package holds the MongoDB client and connects to the MongoDB database to handle queries and commands. The complete news data is stored here.
    - redis: contains the Redis Adapter that implements the CacheRepository interface. This package holds the Redis client and connects to the Redis server to handle queries and data manipulation.
    - elasticsearch: contains the Elasticsearch Adapter that implements the ElasticRepository interface. This package holds the Elasticsearch client and connects to the Elasticsearch server to handle queries and commands. The ID and news creation date are stored here.
    - kafka: contains the Kafka Adapter that holds the Kafka connection and provides methods to write messages to and read messages from the Kafka server.
- serializer: contains the Port interface for encoding and decoding data. It is used by our API to decode and encode payloads.
    - json: contains the JSON Adapter that implements the serializer interface
    - msgpack: contains the MessagePack Adapter that implements the serializer interface
- services: contains the Port interface for our domain service and logic
    - logic: contains the service Adapter that implements the service interface, handling service logic such as constructing repository parameters and calling the repository interfaces to perform data manipulation and queries
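To make the Port/Adapter relationship concrete, here is a sketch of what the repositories Port might look like; the method set is an assumption based on the endpoints above:

```go
package repositories

import "context"

// News is a stand-in for the model in the models package.
type News struct {
	ID     int64
	Author string
	Body   string
}

// NewsRepository is the Port. The mysql and mongodb packages each
// provide an Adapter that satisfies it, so the service layer
// depends only on this abstraction (Dependency Inversion), and
// main injects whichever adapter the driver variable selects
// (Dependency Injection). Method signatures are illustrative.
type NewsRepository interface {
	Store(ctx context.Context, n *News) error
	Find(ctx context.Context, id int64) (*News, error)
	Update(ctx context.Context, n *News) error
	Delete(ctx context.Context, id int64) error
}
```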