SURFconext auth module and nginx integration #70

Merged
merged 8 commits into from
Jan 12, 2022
4 changes: 1 addition & 3 deletions .gitignore
@@ -32,9 +32,7 @@ yarn-error.log*

# local env files frontend
.env.local
.env.development.local
.env.test.local
.env.production.local
.env.*.local

# vercel
.vercel
38 changes: 19 additions & 19 deletions README.md
@@ -1,4 +1,5 @@
![](https://user-images.githubusercontent.com/4195550/136156498-736f915f-7623-43d2-8678-f30b06563a38.png)

# RSD-as-a-service

## Our mission: To promote the visibility, impact and reuse of research software
@@ -7,40 +8,39 @@ This repo contains the new RSD-as-a-service implementation

## Building

The program can easily be built with `docker-compose`:
The program can easily be built with `docker-compose`. Each service builds its image with a specific version tag (see the docker-compose.yml file). Ensure that the version number in docker-compose.yml is increased whenever the source code of that service changes.

1. Run the command `docker-compose build`. You might see an error from the service `data-migration_1`. This is intentional: it is a workaround to download dependencies when needed without starting the program. The build process itself should end successfully.
- Run the command `docker-compose build`.

## Running locally

Navigate to the frontend folder, copy the `env.local.example` file to `env.production.local`, and provide the missing values.

- The `env.local` file is used when running the frontend locally with `yarn dev`.
- The `env.production.local` file is used when running the frontend with `docker-compose up`.

More information about the [frontend setup is available here](frontend/README.md).
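The copy step above can be sketched as follows. This is a self-contained demo: the `mkdir`/`touch` lines only simulate a checkout so the snippet runs anywhere — in the real repository the `frontend` folder and `env.local.example` already exist.

```shell
# Simulate the checkout (not needed in the real repo):
mkdir -p frontend && touch frontend/env.local.example
# The actual step: copy the example to the production env file
cp frontend/env.local.example frontend/env.production.local
# Fill in the missing values in env.production.local before docker-compose up
ls frontend/env.production.local
```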

The program can easily be built with `docker-compose`. To run with the data migration script:
1. Navigate to the frontend folder, copy the `env.local.example` file to `env.production.local`, and provide the missing values. More information about the [frontend setup is available here](frontend/README.md).

1. Run the command `docker-compose up`. As a lot of data is downloaded and processed, this can take a while (think 15 to 30 seconds).
2. Run the command `docker-compose up`. The application can be viewed at http://localhost.

2. After the service `data-migration_1` has exited with code `0`, you can view the data at various endpoints, e.g. http://localhost:3500/software and http://localhost:3500/project?select=*,output:mention!output_for_project(*),impact:mention!impact_for_project(*).
```bash
docker-compose up
```

To run without the data migration script:
## Clear/remove data (reset)

1. Run the command `docker-compose up --scale data-migration=0`.
To clear the database (for example, if the database structure has changed or you need to run the data migration again), run the command:

To run without frontend and data migration in docker-compose:
```bash
docker-compose down --volumes
```

1. Run the command `docker-compose up --scale frontend=0 --scale data-migration=0`.
## Data migration

A service for automated testing is also included. It is called `test` and can be included by name or excluded with `--scale test=0`. The tests assume that the database is empty at the start and delete all database content in the final tests, so they cannot be run in conjunction with the `data-migration` service.

To clear the database (if the database structure has changed for example) before repeating the process:

1. Run the command `docker-compose down --volumes`.
- Run the current RSD solution using `docker-compose up` from the root of the project.
- Run the migration script using the docker-compose file in the data-migration folder.

## Tech Stack
![image](https://user-images.githubusercontent.com/4195550/147217992-0ae7fd21-e775-4b9d-ba5a-b4f50576936f.png)
More information about [data migration is available here](data-migration/README.md).

## Tech Stack

![image](https://user-images.githubusercontent.com/4195550/147217992-0ae7fd21-e775-4b9d-ba5a-b4f50576936f.png)
@@ -22,6 +22,7 @@ public JwtCreator(String signingSecret) {

String createUserJwt(String account) {
return JWT.create()
.withClaim("iss", "rsd_auth")
.withClaim("role", "rsd_user")
.withClaim("account", account)
.withExpiresAt(new Date(System.currentTimeMillis() + ONE_HOUR_IN_MILLISECONDS))
@@ -30,6 +31,7 @@ String createUserJwt(String account) {

String createAdminJwt() {
return JWT.create()
.withClaim("iss", "rsd_auth")
.withClaim("role", "rsd_admin")
.withExpiresAt(new Date(System.currentTimeMillis() + ONE_HOUR_IN_MILLISECONDS))
.sign(SIGNING_ALGORITHM);
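The new `iss` claim added above can be checked by decoding a token's payload section. A minimal sketch with a sample payload like the one `createUserJwt` now produces — the payload value here is hypothetical, not a token issued by the service:

```shell
# A JWT is header.payload.signature, each section base64url-encoded.
payload='{"iss":"rsd_auth","role":"rsd_user","account":"demo-account"}'
# base64url = standard base64 with '+/' mapped to '-_'
# (real JWTs also strip the '=' padding; kept here for simplicity)
encoded=$(printf '%s' "$payload" | base64 -w0 | tr '+/' '-_')
# Reverse the character mapping and decode to inspect the claims:
decoded=$(printf '%s' "$encoded" | tr -- '-_' '+/' | base64 -d)
echo "$decoded"   # -> {"iss":"rsd_auth","role":"rsd_user","account":"demo-account"}
```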
@@ -15,8 +15,8 @@
import java.util.Properties;

public class Main {

static final long ONE_HOUR_IN_MILLISECONDS = 3600_000L; // 60 * 60 * 1000
static final long ONE_HOUR_IN_SECONDS = 3600; // 60 * 60
static final Properties CONFIG = new Properties();

public static void main(String[] args) throws IOException {
@@ -42,29 +42,43 @@ public static void main(String[] args) throws IOException {
});

app.post("/login/surfconext", ctx -> {
String code = ctx.formParam("code");
String redirectUrl = CONFIG.getProperty("AUTH_SURFCONEXT_REDIRECT_URL");
String account = new SurfconextLogin(code, redirectUrl).account();
JwtCreator jwtCreator = new JwtCreator(CONFIG.getProperty("PGRST_JWT_SECRET"));
String token = jwtCreator.createUserJwt(account);
setJwtCookie(ctx, token);
ctx.result(token);
});

app.get("/login/surfconext", ctx -> {
String redirectUrl = CONFIG.getProperty("AUTH_SURFCONEXT_REDIRECT_URL");
ctx.html("<a href=\"https://connect.test.surfconext.nl/oidc/authorize?scope=openid&&response_type=code&redirect_uri=" + redirectUrl + "&state=example&nonce=example&response_mode=form_post&client_id=" + CONFIG.getProperty("AUTH_SURFCONEXT_CLIENT_ID") + "\">Login with surfconext</a>");
try{
String returnPath = ctx.cookie("rsd_pathname");
String code = ctx.formParam("code");
String redirectUrl = CONFIG.getProperty("NEXT_PUBLIC_SURFCONEXT_REDIRECT");
String account = new SurfconextLogin(code, redirectUrl).account();
JwtCreator jwtCreator = new JwtCreator(CONFIG.getProperty("PGRST_JWT_SECRET"));
String token = jwtCreator.createUserJwt(account);
setJwtCookie(ctx, token);
// redirect based on info
if (returnPath != null && !returnPath.trim().isEmpty()){
returnPath = returnPath.trim();
ctx.redirect(returnPath);
}else{
ctx.redirect("/");
}
}catch (RuntimeException ex){
ex.printStackTrace();
ctx.status(400);
ctx.redirect("/login/failed");
}
});

app.get("/refresh", ctx -> {
String tokenToVerify = ctx.cookie("rsd_token");
String signingSecret = CONFIG.getProperty("PGRST_JWT_SECRET");
JwtVerifier verifier = new JwtVerifier(signingSecret);
verifier.verify(tokenToVerify);
try{
String tokenToVerify = ctx.cookie("rsd_token");
String signingSecret = CONFIG.getProperty("PGRST_JWT_SECRET");
JwtVerifier verifier = new JwtVerifier(signingSecret);
verifier.verify(tokenToVerify);

JwtCreator jwtCreator = new JwtCreator(signingSecret);
String token = jwtCreator.refreshToken(tokenToVerify);
setJwtCookie(ctx, token);
JwtCreator jwtCreator = new JwtCreator(signingSecret);
String token = jwtCreator.refreshToken(tokenToVerify);
setJwtCookie(ctx, token);
}catch (RuntimeException ex){
ex.printStackTrace();
ctx.status(400);
ctx.json("{\"message\": \"failed to refresh token\"}");
}
});

app.exception(JWTVerificationException.class, (ex, ctx) -> {
@@ -75,7 +89,7 @@ public static void main(String[] args) throws IOException {
}

static void setJwtCookie(Context ctx, String token) {
ctx.header("Set-Cookie", "rsd_token=" + token + "; Secure; HttpOnly; Path=/; SameSite=Lax");
ctx.header("Set-Cookie", "rsd_token=" + token + "; Secure; HttpOnly; Path=/; SameSite=Lax; Max-Age=" + ONE_HOUR_IN_SECONDS);
}

static String decode(String base64UrlEncoded) {
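The cookie change above aligns the cookie lifetime with the token lifetime. The resulting header value can be sketched as follows — the token value is a placeholder, not a real JWT:

```shell
# setJwtCookie now appends Max-Age, so the browser drops the cookie
# at the same time the JWT itself expires (both are one hour).
ONE_HOUR_IN_SECONDS=3600
token="aaa.bbb.ccc"   # placeholder token value
cookie="rsd_token=${token}; Secure; HttpOnly; Path=/; SameSite=Lax; Max-Age=${ONE_HOUR_IN_SECONDS}"
echo "$cookie"
```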
@@ -45,7 +45,7 @@ private Map<String, String> createForm() {
form.put("grant_type", "authorization_code");
form.put("redirect_uri", REDIRECT_URL);
form.put("scope", "openid");
form.put("client_id", CONFIG.getProperty("AUTH_SURFCONEXT_CLIENT_ID"));
form.put("client_id", CONFIG.getProperty("NEXT_PUBLIC_SURFCONEXT_CLIENT_ID"));
form.put("client_secret", CONFIG.getProperty("AUTH_SURFCONEXT_CLIENT_SECRET"));
return form;
}
66 changes: 65 additions & 1 deletion data-migration/README.md
@@ -1,2 +1,66 @@
## Data migration
# Data migration

This directory contains a Java program that migrates data from the existing RSD to the new RSD. It is currently under construction. After the data has been migrated and the new RSD is in production, this script will be deprecated.

## Build solution

```bash
docker-compose build
```

You might see an error from the service `data-migration_1`. This is intentional: it is a workaround to download dependencies when needed without starting the program. The build process itself should end successfully.

## Run solution

```bash
# ensure main RSD application is started too
# cd ../ && docker-compose up
# run the data-migration docker-compose file
docker-compose up
```

## Possible errors

When running the data migration script multiple times, it might fail with a `duplicate key violation` error. The script migrates all data in one go; if it fails in the middle of the migration, you will need to remove all data from the database prior to running the script again.

```bash
data-migration | [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:3.0.0:java (default-cli) on project data-migration: An exception occured while executing the Java class. Error fetching data from the endpoint: {"hint":null,"details":null,"code":"23505","message":"duplicate key value violates unique constraint \"software_slug_key\""} -> [Help 1]

...

data-migration exited with code 1
```

The easiest way to remove the data is to stop the main docker-compose session and remove the data volumes with `docker-compose down --volumes`. Then start the main app docker-compose and the data-migration docker-compose again.

```bash
# back to root folder
cd ..
# stop main app and remove database volume
docker-compose down --volumes
# start app again (the postgres service will create all tables & views)
docker-compose up
# move to data-migration
cd data-migration
# run data migration docker-compose
docker-compose up
```

A successful data migration log should be similar to the one below:

```bash
data-migration | [INFO] Scanning for projects...
data-migration | [INFO]
data-migration | [INFO] ----------------< nl.research-software:data-migration >-----------------
data-migration | [INFO] Building data-migration 1.0-SNAPSHOT
data-migration | [INFO] --------------------------------[ jar ]---------------------------------
data-migration | [INFO]
data-migration | [INFO] --- exec-maven-plugin:3.0.0:java (default-cli) @ data-migration ---
data-migration | [INFO] ------------------------------------------------------------------------
data-migration | [INFO] BUILD SUCCESS
data-migration | [INFO] ------------------------------------------------------------------------
data-migration | [INFO] Total time: 7.317 s
data-migration | [INFO] Finished at: 2022-01-03T09:33:54Z
data-migration | [INFO] ------------------------------------------------------------------------
data-migration exited with code 0
```
12 changes: 12 additions & 0 deletions data-migration/docker-compose.yml
@@ -0,0 +1,12 @@
version: "3.0"

services:
data-migration:
container_name: data-migration
build:
context: ../
dockerfile: ./data-migration/dockerfile
image: rsd/data-migration:0.0.3
environment:
PROPERTIES: "/usr/mymaven/.env.production.local"
network_mode: "host"
@@ -32,7 +32,7 @@ public class Main {
public static final String LEGACY_RSD_PERSON_URI = "https://research-software.nl/api/person";
public static final String LEGACY_RSD_MENTION_URI = "https://research-software.nl/api/mention";
public static final String LEGACY_RSD_RELEASE_URI = "https://research-software.nl/api/release";
public static final String POSTGREST_URI = "http://localhost:3500";
public static final String POSTGREST_URI = "http://localhost/api/v1";

public static void main(String[] args) throws IOException {
CONFIG.load(new FileReader(args[0]));
@@ -104,7 +104,7 @@ public static void tryBackendConnection() {
for (int tryConnectionCount = 0; tryConnectionCount < maxTries; tryConnectionCount++) {
pauseExecution(500);
try {
getPostgREST(URI.create(POSTGREST_URI));
getPostgREST(URI.create(POSTGREST_URI + "/"));
} catch (RuntimeException e) {
continue;
}
64 changes: 42 additions & 22 deletions docker-compose.yml
@@ -4,8 +4,12 @@ services:
database:
container_name: database
build: ./database
image: rsd/database:0.0.1
ports:
# enable connection from outside
- "5432:5432"
# expose:
# - 5432
environment:
POSTGRES_DB: "rsd-db"
POSTGRES_USER: "rsd"
@@ -21,39 +25,38 @@ services:
backend:
container_name: backend
build: ./backend-postgrest
ports:
- "3500:3500"
image: rsd/backend-postgrest:0.0.1
# ports:
# - "3500:3500"
expose:
- 3500
environment:
PGRST_DB_URI: "postgres://authenticator:simplepassword@database:5432/rsd-db"
PGRST_DB_ANON_ROLE: "web_anon"
PGRST_DB_SCHEMA: "public"
PGRST_SERVER_PORT: "3500"
env_file:
- ./frontend/.env.production.local
networks:
- net

data-migration:
container_name: data-migration
build:
context: .
dockerfile: ./data-migration/dockerfile
depends_on:
- "database"
- "backend"
environment:
PROPERTIES: "/usr/mymaven/.env.production.local"
network_mode: "host"
networks:
- net

authentication:
container_name: authentication
auth:
container_name: auth
build:
context: .
dockerfile: ./authentication/dockerfile
ports:
- "7000:7000"
image: rsd/auth:0.0.4
# ports:
# - "7000:7000"
expose:
- 7000
environment:
PROPERTIES: "/usr/mymaven/.env.production.local"
depends_on:
- "database"
- "backend"
networks:
- net

@@ -64,20 +67,37 @@ services:
# dockerfile to use for build
dockerfile: Dockerfile
# update version number to correspond to frontend/package.json
image: rsd/frontend:0.3.0
image: rsd/frontend:0.3.2
env_file:
# configuration
- ./frontend/.env.production.local
# ports:
# - "3000:3000"
expose:
- 3000
depends_on:
- "database"
- "backend"
- "auth"
networks:
- net

nginx:
container_name: nginx
image: nginx:1.21
ports:
- "3000:3000"
- "80:80"
depends_on:
- "database"
- "backend"
- "auth"
- "frontend"
volumes:
- ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf
networks:
- net

test:
container_name: test
container_name: auth-test
build:
context: .
# dockerfile to use for build
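The compose changes above mount `./nginx/nginx.conf`, but that file is not part of this diff. Based on the service names and ports in docker-compose.yml (and the new `POSTGREST_URI` of `http://localhost/api/v1`), the reverse-proxy layout is presumably along these lines — the route paths and proxy targets here are assumptions, not the actual file contents:

```nginx
server {
    listen 80;

    # frontend (Next.js) serves everything not matched below
    location / {
        proxy_pass http://frontend:3000;
    }

    # PostgREST backend; the trailing slash strips the /api/v1 prefix
    location /api/v1/ {
        proxy_pass http://backend:3500/;
    }

    # authentication module endpoints
    location /login/ {
        proxy_pass http://auth:7000/login/;
    }
    location /refresh {
        proxy_pass http://auth:7000/refresh;
    }
}
```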