
Merge pull request #934 from IGS/devel
Merging to deploy #926
adkinsrs authored Oct 30, 2024
2 parents 70bbdc5 + d31c2fa commit fb3be64
Showing 10 changed files with 83 additions and 84 deletions.
37 changes: 27 additions & 10 deletions create_schema.sql
@@ -1,3 +1,6 @@
-- drop database gear_portal; create database gear_portal; use gear_portal;
-- source /home/jorvis/git/gEAR/create_schema.sql

CREATE TABLE organism (
id INT PRIMARY KEY AUTO_INCREMENT,
label VARCHAR(255) NOT NULL,
@@ -36,10 +39,9 @@ CREATE TABLE guser (
help_id VARCHAR(50),
date_created DATETIME DEFAULT CURRENT_TIMESTAMP,
default_org_id INT NOT NULL DEFAULT 1,
layout_id VARCHAR(24),
is_curator TINYINT(1) DEFAULT 0
FOREIGN KEY fk_guser_doi(default_org_id) REFERENCES organism(id),
FOREIGN KEY fk_guser_layout(layout_share_id) REFERENCES layout(share_id) ON DELETE CASCADE
layout_id INT,
is_curator TINYINT(1) DEFAULT 0,
FOREIGN KEY fk_guser_doi(default_org_id) REFERENCES organism(id)
) ENGINE=INNODB;

-- password is a hashlib md5 hexdigest
@@ -354,6 +356,10 @@ CREATE TABLE layout (
REFERENCES guser(id)
ON DELETE CASCADE
) ENGINE=INNODB;

-- Adding this constraint here so we avoid a chicken/egg problem, since both tables reference each other
ALTER TABLE guser ADD CONSTRAINT FOREIGN KEY fk_guser_layout(layout_id) REFERENCES layout(id) ON DELETE CASCADE;

INSERT INTO layout VALUES (0, 0, NULL, "Hearing (default)", 1);
INSERT INTO layout VALUES (10000, 0, NULL, "Brain development (default)", 0);
INSERT INTO layout VALUES (10001, 0, NULL, "Huntingtons disease (default)", 0);
@@ -425,15 +431,26 @@ CREATE TABLE tag (
label VARCHAR(55)
) ENGINE=INNODB;

CREATE TABLE comment (
id int PRIMARY KEY AUTO_INCREMENT,
first_name varchar(255) DEFAULT NULL,
last_name varchar(255) DEFAULT NULL,
user_id int NOT NULL,
email varchar(255) DEFAULT NULL,
title varchar(255) DEFAULT NULL,
message varchar(1020) DEFAULT NULL,
is_read tinyint DEFAULT 0,
date_added datetime DEFAULT NULL,
FOREIGN KEY comment_ibfk_1 (user_id) REFERENCES guser(id)
) ENGINE=INNODB;

-- multiple tags to multiple comments
CREATE TABLE comment_tag (
id INT PRIMARY KEY AUTO_INCREMENT,
tag_id INT,
comment_id INT,
id INT PRIMARY KEY AUTO_INCREMENT,
tag_id INT,
comment_id INT,
FOREIGN KEY (tag_id) REFERENCES tag(id),
FOREIGN KEY (comment_id)
REFERENCES comment(id)
ON DELETE CASCADE
FOREIGN KEY (comment_id) REFERENCES comment(id) ON DELETE CASCADE
) ENGINE=INNODB;

CREATE TABLE dataset_tag (
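The `comment_tag` join table added in this file models the many-to-many relationship between tags and comments. As a rough standalone sketch of how such a join table behaves — using `sqlite3` and pared-down stand-in tables, not the real MySQL schema:

```python
import sqlite3

# Minimal stand-ins for the real tag/comment/comment_tag tables.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
    CREATE TABLE tag (id INTEGER PRIMARY KEY, label TEXT);
    CREATE TABLE comment (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE comment_tag (
        id INTEGER PRIMARY KEY,
        tag_id INTEGER REFERENCES tag(id),
        comment_id INTEGER REFERENCES comment(id) ON DELETE CASCADE
    );
""")
conn.execute("INSERT INTO tag VALUES (1, 'bug'), (2, 'question')")
conn.execute("INSERT INTO comment VALUES (10, 'Plot fails')")
# One comment can carry many tags, and one tag can mark many comments.
conn.execute("INSERT INTO comment_tag (tag_id, comment_id) VALUES (1, 10), (2, 10)")
labels = [row[0] for row in conn.execute(
    "SELECT t.label FROM tag t JOIN comment_tag ct ON ct.tag_id = t.id "
    "WHERE ct.comment_id = 10 ORDER BY t.label")]
print(labels)  # ['bug', 'question']
```

The `ON DELETE CASCADE` on `comment_id` means deleting a comment silently removes its tag links, matching the schema above.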
9 changes: 7 additions & 2 deletions docs/setup.apache.md
@@ -53,13 +53,13 @@ $ sudo a2enmod include
Then the Directory commands can look like this. It would be nice to find out why combining these blocks causes errors.

<Directory /var/www/html>
<Directory /var/www>
Options Indexes FollowSymLinks
AllowOverride None
Require all granted
</Directory>

<Directory /var/www/html>
<Directory /var/www>
Options +ExecCGI +Includes
AddHandler cgi-script .py .cgi
AddOutputFilter INCLUDES .html
@@ -119,6 +119,11 @@ Resources:
This needs to be tailored for each machine's resources to match the processors (cores) present
and number of threads within each process. If not running under SSL, this goes in 000-default.conf.

Finally, you need to make sure the SSL conf file is symlinked under sites-enabled:

$ cd /etc/apache2/sites-enabled
$ sudo ln -s ../sites-available/umgear-ssl.conf .

There's a lot in here, but the CGI-related addition is:

<Directory /var/www/cgi>
25 changes: 4 additions & 21 deletions docs/setup.new_server.notes.md
@@ -1,6 +1,6 @@
# Info for setting up or upgrading a new server

## These operations were last performed on Ubuntu 20.04 LTS
## These operations were last performed on Ubuntu 22.04 LTS

Instances of a gEAR Portal are most often run within a cloud instance, where you can choose your own operating system and resources. On Google Cloud for a starter instance I chose an e2-standard-2 (2 vCPUs and 48GB RAM) with 300GB of solid state disk space. You'll definitely want to increase the CPU as you gain more simultaneous users and RAM depending on your dataset sizes. Once you create and start the instance:

@@ -14,27 +14,10 @@ Reboot if there are kernel updates (or just to be safe if you don't know.)
```bash
cd && mkdir git
sudo apt install git
cd git
git clone https://github.com/IGS/gEAR.git
```

### Updating Ubuntu version to 22.04 LTS

To check current version, perform `lsb_release -a`

`sudo apt install update-manager-core` (which should already be installed)
`sudo apt update && sudo apt dist-upgrade`
`sudo do-release-upgrade`

You may be prompted to perform a reboot at this point in order to do the upgrade, which is done with `sudo reboot`. This will kick you out of the VM. Just ssh back in and do `sudo do-release-upgrade`.

Follow the prompts and let the upgrade do its thing. There is a note that it can take several hours, so keep that in mind. The upgrade will also prompt for another restart of the server.

Do another `lsb_release -a` to confirm the version upgrade (should be 22.04 Jammy)

`sudo apt update`

At this point, we could remove the Python2 packages using `sudo apt autoremove`. From my experience Python2-related packages were the only things to go.

### MYSQL

sudo apt install mysql-server
@@ -47,13 +30,13 @@ Not necessary if you want projectR to run on a Google Cloud Run service (configu

`sudo apt install r-base`

Please consult `setup_notes_r_rpy2.md` for packages to install in order to install requisite R packages
Please consult `setup.r_rpy2.md` for packages to install in order to install requisite R packages

### RabbitMQ

Not necessary if you want projectR to run in the Apache environment or do not want to set up the RabbitMQ messaging service (configurable in gear.ini)

Follow instructions in setup_rabbit_mq.md document
Follow instructions in setup.rabbit_mq.md document

### Python

2 changes: 2 additions & 0 deletions docs/setup.python.md
@@ -89,9 +89,11 @@ Scanpy (or dependencies like numba) assumes it can write in several directories
$ find ./ -name __pycache__ -exec chmod 777 {} \;

NOTE: Installing a custom version of diffxpy that is based on the latest commit on the main branch (at the time). It does not have a release tag, but fixes a NumPy bug that occurs with older diffxpy commits and newer NumPy releases.

$ /opt/Python-${PYTHONV}/bin/python3 -m pip install git+https://github.com/theislab/diffxpy.git@7609ea935936e3739fc4c71b75c8ee8ca57f51ea

The MulticoreTSNE module currently fails with cmake 3.22.0 or greater. I have a pending pull request to fix this but until then:

$ /opt/Python-${PYTHONV}/bin/python3 -m pip install git+https://github.com/jorvis/Multicore-TSNE.git@68325753c4ab9758e3d242719cd4845d751a4b6c

## Note about Flask
13 changes: 8 additions & 5 deletions docs/setup_notes_r_rpy2.md → docs/setup.r_rpy2.md
@@ -4,11 +4,16 @@ The base version of R installed on Ubuntu Bionic (18.04) is not a high enough ve

## Prerequisites to install via apt-get

`sudo apt-get install gfortran libbz2-dev libcurl4-openssl-dev liblzma-dev libpcre3 libpcre3-dev libgomp1 libharfbuzz-dev libfribidi-dev libfreetype6-dev libpng-dev libtiff5-dev libjpeg-dev`
`sudo apt-get install gfortran libbz2-dev libssl-dev libfontconfig1-dev libxml2-dev libcurl4-openssl-dev liblzma-dev libpcre3 libpcre3-dev libgomp1 libharfbuzz-dev libfribidi-dev libfreetype6-dev libpng-dev libtiff5-dev libjpeg-dev`

## Installing R

Run `sudo sh <gEAR\_git\_root>/services/projectr/install_bioc.sh` to install R, Bioconductor, and projectR
Run this to install R, Bioconductor, and projectR:

```text
cd <gEAR\_git\_root>/services/projectr/
sudo sh ./install_bioc.sh
```

To ensure R's shared libraries are found, create a file "libR.conf" in /etc/ld.so.conf.d and add the following contents:

@@ -21,6 +26,4 @@ Then run `sudo ldconfig` to cache the shared libraries. You can confirm shared

## Installing rpy2

`<python\_bin>/pip3 install rpy2==3.5.1`

Later versions seem to have an error that is not an issue with this version
This will be done in the Python installation doc steps.
43 changes: 3 additions & 40 deletions docs/setup_rabbitmq.md → docs/setup.rabbitmq.md
@@ -2,46 +2,9 @@

RabbitMQ is needed to act as a message broker, so that some of the load is taken off the Flask instance when performing API requests. In particular, this is relevant to the projectR API calls, where some of the dataset operations can be memory-intensive. By putting this responsibility outside of the Apache worker, we can hopefully prevent Apache from crashing, and better control the load of memory-intensive requests.
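The broker pattern described above can be illustrated with a stdlib-only sketch (a plain `queue.Queue` and worker thread standing in for RabbitMQ and its consumer — not gEAR's actual RabbitMQ code):

```python
import queue
import threading

# Stand-in for the broker: requests are queued instead of running inside
# the web worker, so a slow projection job cannot tie up Apache/Flask.
jobs = queue.Queue()
results = {}

def worker():
    while True:
        job_id, payload = jobs.get()
        if job_id is None:  # shutdown sentinel
            break
        # Imagine a memory-intensive projectR run here.
        results[job_id] = sum(payload)
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

# The "API endpoint" just enqueues work and returns quickly;
# a real consumer would publish the result back when done.
jobs.put(("job-1", [1, 2, 3]))
jobs.join()
print(results["job-1"])  # 6
jobs.put((None, None))  # stop the worker
```

With RabbitMQ the queue lives in a separate daemon, so the memory-heavy consumer can be throttled or restarted independently of Apache.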

To install RabbitMQ on your server, run this script (taken from https://www.rabbitmq.com/install-debian.html#apt-quick-start-cloudsmith). Do note that the script was outdated so I updated "bionic" to "jammy" to reflect the version of Ubuntu being used.

```bash
#!/usr/bin/sh

sudo apt-get install curl gnupg apt-transport-https -y

## Team RabbitMQ's main signing key
curl -1sLf "https://keys.openpgp.org/vks/v1/by-fingerprint/0A9AF2115F4687BD29803A206B73A36E6026DFCA" | sudo gpg --dearmor | sudo tee /usr/share/keyrings/com.rabbitmq.team.gpg > /dev/null
## Cloudsmith: modern Erlang repository
curl -1sLf https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-erlang/gpg.E495BB49CC4BBE5B.key | sudo gpg --dearmor | sudo tee /usr/share/keyrings/io.cloudsmith.rabbitmq.E495BB49CC4BBE5B.gpg > /dev/null
## Cloudsmith: RabbitMQ repository
curl -1sLf https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-server/gpg.9F4587F226208342.key | sudo gpg --dearmor | sudo tee /usr/share/keyrings/io.cloudsmith.rabbitmq.9F4587F226208342.gpg > /dev/null

## Add apt repositories maintained by Team RabbitMQ
sudo tee /etc/apt/sources.list.d/rabbitmq.list <<EOF
## Provides modern Erlang/OTP releases
##
deb [signed-by=/usr/share/keyrings/io.cloudsmith.rabbitmq.E495BB49CC4BBE5B.gpg] https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-erlang/deb/ubuntu jammy main
deb-src [signed-by=/usr/share/keyrings/io.cloudsmith.rabbitmq.E495BB49CC4BBE5B.gpg] https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-erlang/deb/ubuntu jammy main
## Provides RabbitMQ
##
deb [signed-by=/usr/share/keyrings/io.cloudsmith.rabbitmq.9F4587F226208342.gpg] https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-server/deb/ubuntu jammy main
deb-src [signed-by=/usr/share/keyrings/io.cloudsmith.rabbitmq.9F4587F226208342.gpg] https://dl.cloudsmith.io/public/rabbitmq/rabbitmq-server/deb/ubuntu jammy main
EOF

## Update package indices
sudo apt-get update -y

## Install Erlang packages
sudo apt-get install -y erlang-base \
erlang-asn1 erlang-crypto erlang-eldap erlang-ftp erlang-inets \
erlang-mnesia erlang-os-mon erlang-parsetools erlang-public-key \
erlang-runtime-tools erlang-snmp erlang-ssl \
erlang-syntax-tools erlang-tftp erlang-tools erlang-xmerl

## Install rabbitmq-server and its dependencies
sudo apt-get install rabbitmq-server -y --fix-missing
```
To install RabbitMQ on your server, run the script here (and be sure to click the tab on this page for your specific OS version):

https://www.rabbitmq.com/docs/install-debian#apt-quick-start-cloudsmith

Test that the installation worked by checking `which rabbitmq-server`

6 changes: 5 additions & 1 deletion lib/gear/plotting.py
@@ -549,7 +549,11 @@ def generate_plot(df, x=None, y=None, z=None, facet_row=None, facet_col=None,

if colormap:
# Use black outlines with colormap fillcolor. Pertains mostly to violin plots
new_plotting_args['fillcolor'] = colormap[curr_color]
try:
new_plotting_args['fillcolor'] = colormap[curr_color]
except KeyError as e:
# If color series and colormap differ, skip coloring but still make the plot.
print("ERROR: Series {} not found in passed-in colormap. Skipping.".format(curr_color), file=sys.stderr)

# Now determine which plot this trace should go to. Facet column is first if row does not exist.
# Note the "facet_row/col_indexes" enum command started indexing at 1, so no need to increment for 1-indexed subplots
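The plotting.py hunk above wraps the colormap lookup in a try/except so that a color series missing from the passed-in colormap warns instead of aborting the whole plot. The pattern in isolation (helper name is mine, not gEAR's):

```python
import sys

def fill_color_for(series, colormap):
    """Return the fill color for a series, or None if it is not mapped."""
    try:
        return colormap[series]
    except KeyError:
        # Mirrors the diff: warn on stderr and keep plotting without a fill.
        print("ERROR: Series {} not found in passed-in colormap. Skipping.".format(series),
              file=sys.stderr)
        return None

colormap = {"treated": "#1f77b4"}
print(fill_color_for("treated", colormap))  # #1f77b4
print(fill_color_for("control", colormap))  # None (warns instead of raising)
```

`colormap.get(series)` would be equivalent here; the explicit except mirrors the diff and leaves room for the stderr message.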
8 changes: 6 additions & 2 deletions www/api/resources/aggregations.py
@@ -3,7 +3,7 @@
import os
import geardb

from .common import get_adata_shadow
from .common import get_adata_shadow, get_adata_from_analysis

class Aggregations(Resource):
"""Resource for retrieving observation aggregations for a dataset and applied categorical observation filters
@@ -43,10 +43,14 @@ def post(self, dataset_id):
h5_path = ds.get_file_path()

try:
adata = get_adata_shadow(analysis_id, dataset_id, session_id, h5_path)
if not filters:
adata = get_adata_shadow(analysis_id, dataset_id, session_id, h5_path)
else:
adata = get_adata_from_analysis(analysis_id, dataset_id, session_id)
except FileNotFoundError:
return {
"success": -1,
"aggretations": [],
'message': "No h5 file found for this dataset"
}

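The aggregations.py change routes filtered requests through the analysis copy of the data instead of the shadow h5 file. A runnable paraphrase of that control flow — the two `get_adata_*` helpers below are hypothetical stubs standing in for gEAR's real ones, and the error payload is simplified:

```python
def get_adata_shadow(analysis_id, dataset_id, session_id, h5_path):
    # Hypothetical stub; the real helper reads the dataset's h5 file.
    if h5_path is None:
        raise FileNotFoundError(h5_path)
    return {"source": "shadow"}

def get_adata_from_analysis(analysis_id, dataset_id, session_id):
    # Hypothetical stub; the real helper loads the analysis-specific data.
    return {"source": "analysis"}

def load_adata(analysis_id, dataset_id, session_id, h5_path, filters):
    """Mirror the hunk: only unfiltered requests may use the shadow file."""
    try:
        if not filters:
            return get_adata_shadow(analysis_id, dataset_id, session_id, h5_path)
        return get_adata_from_analysis(analysis_id, dataset_id, session_id)
    except FileNotFoundError:
        return {"success": -1, "aggregations": [],
                "message": "No h5 file found for this dataset"}

print(load_adata("a1", "d1", "s1", "/tmp/x.h5", None)["source"])          # shadow
print(load_adata("a1", "d1", "s1", "/tmp/x.h5", {"sex": ["F"]})["source"])  # analysis
```

The `FileNotFoundError` branch keeps the endpoint's existing behavior of returning a soft failure rather than a 500.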
19 changes: 17 additions & 2 deletions www/js/curator_common.js
@@ -141,15 +141,17 @@ const curatorApiCallsMixin = {
* @returns {Promise<{aggregations: object, total_count: number}>} The fetched aggregations and total count.
*/
async fetchAggregations(datasetId, analysisId, filters){
const errorMsg = "Could not fetch number of observations for this dataset. Please contact the gEAR team.";
try {
const data = await super.fetchAggregations(datasetId, analysisId, filters);
if (data.hasOwnProperty("success") && data.success < 1) {
throw new Error(data?.message || "Could not fetch number of observations for this dataset. Please contact the gEAR team.");
throw new Error(data?.message || errorMsg);
}
const {aggregations, total_count} = data;
return {aggregations, total_count};
} catch (error) {
logErrorInConsole(error);
throw new Error(errorMsg);
}
},
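The `fetchAggregations` change above hoists the user-facing message into one variable and rethrows every failure with it, so callers see a single consistent error. A Python paraphrase of that wrapper (names are mine; note that, as in the JS, the catch-all replaces even the server-supplied message with the generic one):

```python
def fetch_aggregations(fetch):
    """Run `fetch` and normalize any failure into one user-facing error."""
    error_msg = ("Could not fetch number of observations for this dataset. "
                 "Please contact the gEAR team.")
    try:
        data = fetch()
        if data.get("success", 1) < 1:
            raise RuntimeError(data.get("message") or error_msg)
        return data["aggregations"], data["total_count"]
    except Exception:
        # Log-and-rethrow point: every failure surfaces as the same message.
        raise RuntimeError(error_msg)

aggs, total = fetch_aggregations(lambda: {"aggregations": {"sex": 2}, "total_count": 5})
print(total)  # 5
```

Hoisting the message means the success-check branch and the catch branch cannot drift apart over time.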

@@ -761,9 +763,20 @@ const createColorscaleSelectInstance = (idSelector, colorscaleSelect=null) => {
const createFacetWidget = async (datasetId, analysisId, filters) => {
document.getElementById("selected-facets-loader").classList.remove("is-hidden")

const {aggregations, total_count:totalCount} = await curatorApiCallsMixin.fetchAggregations(datasetId, analysisId, filters);
let aggregations = {};
let totalCount = 0;

try {
({aggregations, total_count:totalCount} = await curatorApiCallsMixin.fetchAggregations(datasetId, analysisId, filters));

} catch (error) {
logErrorInConsole(error);
createToast("Could not fetch aggregations. You should still be able to plot.", "is-warning");
return facetWidget;
}
document.getElementById("num-selected").textContent = totalCount;


const facetWidget = new FacetWidget({
aggregations,
filters,
@@ -775,6 +788,8 @@ const createFacetWidget = async (datasetId, analysisId, filters) => {
document.getElementById("num-selected").textContent = totalCount;
} catch (error) {
logErrorInConsole(error);
createToast("Could not update aggregations. You should still be able to plot.", "is-warning");
return facetWidget
}
} else {
// Save an extra API call
5 changes: 4 additions & 1 deletion www/js/dataset_curator.js
@@ -890,7 +890,10 @@ const curatorSpecifcDatasetTreeCallback = () => {
* @param {string} seriesName - The name of the series.
*/
const curatorSpecifcFacetItemSelectCallback = (seriesName) => {
renderColorPicker(seriesName);
// Update the color picker in case some elements of the color series were filtered out
if(plotStyle.plotConfig?.color_name) {
renderColorPicker(plotStyle.plotConfig.color_name);
}
}

/**
