
♻️📝 API-server files collection cleanup and doc #4645

Merged 5 commits on Aug 23, 2023
59 changes: 37 additions & 22 deletions docs/coding-conventions.md
@@ -1,15 +1,44 @@
# Coding Conventions and Linters
# Coding Conventions, Linters and Definitions

Some conventions on coding style and tools for the Javascript and Python code in this repository.

----
## Javascript

In general the `qooxdoo` naming convention/style is followed. The [Access](http://qooxdoo.org/docs/#/core/oo_feature_summary?id=access) paragraph is the most notable. It is recommended to read the entire document.
## Definitions

What is a ...

- **Controller-Service-Repository** design pattern? (see the sketch after this list)
- An introduction: https://tom-collings.medium.com/controller-service-repository-16e29a4684e5
- Example of adopted convention: https://github.com/ITISFoundation/osparc-simcore/pull/4389
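
For orientation, a minimal sketch of the three layers (hypothetical names, not modules from this repository):

```python
from dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str


class UserRepository:
    """Repository: the only layer that talks to the data store."""

    def __init__(self) -> None:
        self._rows: dict[int, User] = {}

    def get(self, user_id: int) -> User | None:
        return self._rows.get(user_id)


class UserService:
    """Service: business rules; knows nothing about HTTP or SQL."""

    def __init__(self, repository: UserRepository) -> None:
        self._repository = repository

    def get_user_name(self, user_id: int) -> str:
        user = self._repository.get(user_id)
        if user is None:
            raise KeyError(f"no user {user_id}")
        return user.name


def get_user_name_controller(service: UserService, user_id: int) -> dict:
    """Controller: maps a transport request (e.g. an HTTP route) to the service."""
    return {"name": service.get_user_name(user_id)}
```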

Have a look at `ESLint`'s configuration files [.eslintrc.json](.eslintrc.json) and [.eslintignore](.eslintignore).


----
## General Coding Conventions

<!-- Add below this line agreed coding conventions and give them a number -->

### CC1: Can I use ``TODO:``, ``FIXME:``?

We should avoid merging PRs with ``TODO:`` and ``FIXME:`` into master. One of our bots detects those and flags them as code smells. If we still want to keep the idea/fix noted in the code, rewrite it as a ``NOTE:`` extended with a link to a GitHub issue with more details. For context, see [discussion here](https://github.com/ITISFoundation/osparc-simcore/pull/3380#discussion_r979893502).
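
For example (the issue number below is a placeholder):

```python
# Avoid: flagged as a code smell by the bots
# TODO: handle the zero-length payload case

# Preferred: keep it as a NOTE with a tracking issue (placeholder link)
# NOTE: handle the zero-length payload case,
#  see https://github.com/ITISFoundation/osparc-simcore/issues/0000
```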


### CC2: No commented code

Avoid commented code, but if you *really* want to keep it, then add an explanatory `NOTE:`
```python
import os
# import bar
# x = "not very useful"

# NOTE: I need to keep this because ...
# import foo
# x = "ok"
```

### CC3 ...

----
## Python

@@ -62,28 +91,14 @@ In short we use the following naming convention ( roughly [PEP8](https://peps.p
- Recommended inside of a ``scripts`` folder


----
## General

<!-- Add below this line agreed coding conventions and give them a number -->

### CC1: Can I use ``TODO:``, ``FIXME:``?

We should avoid merging PRs with ``TODO:`` and ``FIXME:`` into master. One of our bots detects those and flags them as code smells. If we still want to keep the idea/fix noted in the code, those can be rewritten as ``NOTE:`` and should be extended with a link to a GitHub issue with more details. For context, see [discussion here](https://github.com/ITISFoundation/osparc-simcore/pull/3380#discussion_r979893502).


### CC2: No commented code
----
## Javascript

Avoid commented code, but if you *really* want to keep it, then add an explanatory `NOTE:`
```python
import os
# import bar
# x = "not very useful"
In general the `qooxdoo` naming convention/style is followed. The [Access](http://qooxdoo.org/docs/#/core/oo_feature_summary?id=access) paragraph is the most notable. It is recommended to read the entire document.

# NOTE: I need to keep this because ...
# import foo
# x = "ok"
```
Have a look at `ESLint`'s configuration files [.eslintrc.json](.eslintrc.json) and [.eslintignore](.eslintignore).



2 changes: 1 addition & 1 deletion services/api-server/openapi.json
@@ -1570,8 +1570,8 @@
"PUBLISHED",
"NOT_STARTED",
"PENDING",
"WAITING_FOR_RESOURCES",
"STARTED",
"RETRY",
"SUCCESS",
"FAILED",
"ABORTED"
@@ -54,7 +54,10 @@
}


@router.get("", response_model=list[File])
@router.get(
"",
response_model=list[File],
)
async def list_files(
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
user_id: Annotated[int, Depends(get_current_user_id)],
@@ -74,7 +77,7 @@ async def list_files(

file_meta: File = to_file_api_model(stored_file_meta)

except (ValidationError, ValueError, AttributeError) as err:
except (ValidationError, ValueError, AttributeError) as err: # noqa: PERF203
_logger.warning(
"Skipping corrupted entry in storage '%s' (%s)"
"TIP: check this entry in file_meta_data table.",
@@ -112,7 +115,10 @@ def _get_spooled_file_size(file_io: IO) -> int:
return file_size


@router.put("/content", response_model=File)
@router.put(
"/content",
response_model=File,
)
@cancel_on_disconnect
async def upload_file(
request: Request,
@@ -129,6 +135,9 @@ async def upload_file(

assert request # nosec

if file.filename is None:
file.filename = "Undefined"

file_size = await asyncio.get_event_loop().run_in_executor(
None, _get_spooled_file_size, file.file
)
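
The body of ``_get_spooled_file_size`` is elided in this hunk; a plausible sketch under the assumption that it measures size by seeking (blocking I/O once the spooled file has rolled over to disk, which would explain the ``run_in_executor`` call above) — an illustration, not the repository's implementation:

```python
import os
from tempfile import SpooledTemporaryFile
from typing import IO


def _get_spooled_file_size(file_io: IO) -> int:
    # Seek to the end to learn the size, then restore the original
    # position so the subsequent upload still reads from the start.
    original_position = file_io.tell()
    file_io.seek(0, os.SEEK_END)
    file_size = file_io.tell()
    file_io.seek(original_position)
    return file_size


# usage sketch
with SpooledTemporaryFile(max_size=1024) as f:
    f.write(b"hello")
    assert _get_spooled_file_size(f) == 5
```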
@@ -151,16 +160,20 @@
store_id=SIMCORE_LOCATION,
store_name=None,
s3_object=file_meta.storage_file_id,
path_to_upload=UploadableFileObject(file.file, file.filename, file_size),
path_to_upload=UploadableFileObject(
file_object=file.file,
file_name=file.filename,
file_size=file_size,
),
io_log_redirect_cb=None,
)
assert isinstance(upload_result, UploadedFile)
assert isinstance(upload_result, UploadedFile) # nosec

file_meta.checksum = upload_result.etag
return file_meta


# MaG suggested a single function that can upload one or multiple files instead of having
# NOTE: MaG suggested a single function that can upload one or multiple files instead of having
# two of them. Tried something like upload_file( files: Union[list[UploadFile], File] ) but it
# produces an error in the generated openapi.json
#
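
The variant the note refers to would have looked roughly like the sketch below (the route path is made up, and both union arms use FastAPI's `UploadFile` to keep the sketch self-contained; the note's original attempt used the project's `File` model). Per the note, the union breaks OpenAPI schema generation, which is why two separate endpoints were kept:

```python
from typing import Union

from fastapi import FastAPI, UploadFile

app = FastAPI()


# Sketch of the rejected variant: a single endpoint accepting either
# one file or a batch. According to the note above, this produces an
# error in the generated openapi.json.
@app.put("/files/content")
async def upload_files(files: Union[list[UploadFile], UploadFile]):
    ...
```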
@@ -183,16 +196,7 @@ async def get_upload_links(
client_file: ClientFile,
user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
):
"""Get upload links for uploading a file to storage
Arguments:
request -- Request object
client_file -- ClientFile object
user_id -- User Id
Returns:
FileUploadSchema
"""
"""Get upload links for uploading a file to storage"""
assert request # nosec
file_meta: File = await File.create_from_client_file(
client_file,
@@ -208,95 +212,27 @@
file_size=ByteSize(client_file.filesize),
is_directory=False,
)
query = str(upload_links.links.complete_upload.query).removesuffix(":complete")
complete_url = request.url_for(

query = f"{upload_links.links.complete_upload.query}".removesuffix(":complete")
url = request.url_for(
"complete_multipart_upload", file_id=file_meta.id
).include_query_params(**dict(item.split("=") for item in query.split("&")))
query = str(upload_links.links.abort_upload.query).removesuffix(":abort")
abort_url = request.url_for(
upload_links.links.complete_upload = parse_obj_as(AnyUrl, f"{url}")

query = f"{upload_links.links.abort_upload.query}".removesuffix(":abort")
url = request.url_for(
"abort_multipart_upload", file_id=file_meta.id
).include_query_params(**dict(item.split("=") for item in query.split("&")))
upload_links.links.abort_upload = parse_obj_as(AnyUrl, f"{url}")

upload_links.links.complete_upload = parse_obj_as(AnyUrl, f"{complete_url}")
upload_links.links.abort_upload = parse_obj_as(AnyUrl, f"{abort_url}")
return ClientFileUploadSchema(file_id=file_meta.id, upload_schema=upload_links)
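
The query-string rewriting above is dense: the storage service's upload links carry ``:complete``/``:abort`` custom-method suffixes, and their query parameters are re-attached to the api-server's own routes. A standalone illustration of the string mechanics (hypothetical values):

```python
# Hypothetical query string from a storage-service upload link
raw_query = "uploadId=abc123&userId=42:complete"

# Strip the custom-method suffix, then split into key/value pairs
query = raw_query.removesuffix(":complete")
params = dict(item.split("=") for item in query.split("&"))
assert params == {"uploadId": "abc123", "userId": "42"}

# The handler then re-attaches these to its own route:
#   request.url_for("complete_multipart_upload", file_id=...)
#          .include_query_params(**params)
```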


@router.post(
"/{file_id}:complete",
@router.get(
"/{file_id}",
response_model=File,
include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
)
@cancel_on_disconnect
async def complete_multipart_upload(
request: Request,
file_id: UUID,
client_file: Annotated[ClientFile, Body(...)],
uploaded_parts: Annotated[FileUploadCompletionBody, Body(...)],
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
):
"""Complete multipart upload
Arguments:
request: The Request object
file_id: The Storage id
file: The File object which is to be completed
uploaded_parts: The uploaded parts
completion_link: The completion link
user_id: The user id
Returns:
The completed File object
"""
assert request # nosec
assert user_id # nosec

file: File = File(id=file_id, filename=client_file.filename, checksum=None)
complete_link: URL = await storage_client.generate_complete_upload_link(
file, dict(request.query_params)
)

e_tag: ETag = await complete_file_upload(
uploaded_parts=uploaded_parts.parts,
upload_completion_link=parse_obj_as(AnyUrl, f"{complete_link}"),
)

file.checksum = e_tag
return file


@router.post(
"/{file_id}:abort",
include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
responses={**_COMMON_ERROR_RESPONSES},
)
@cancel_on_disconnect
async def abort_multipart_upload(
request: Request,
file_id: UUID,
client_file: Annotated[ClientFile, Body(..., embed=True)],
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
):
"""Abort a multipart upload
Arguments:
request: The Request
file_id: The StorageFileID
upload_links: The FileUploadSchema
user_id: The user id
"""
assert request # nosec
assert user_id # nosec
file: File = File(id=file_id, filename=client_file.filename, checksum=None)
abort_link: URL = await storage_client.generate_abort_upload_link(
file, query=dict(request.query_params)
)
await abort_upload(abort_upload_link=parse_obj_as(AnyUrl, str(abort_link)))


@router.get("/{file_id}", response_model=File, responses={**_COMMON_ERROR_RESPONSES})
async def get_file(
file_id: UUID,
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
@@ -343,6 +279,57 @@ async def delete_file(
raise NotImplementedError(msg)


@router.post(
"/{file_id}:abort",
include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
)
async def abort_multipart_upload(
request: Request,
file_id: UUID,
client_file: Annotated[ClientFile, Body(..., embed=True)],
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
):
assert request # nosec
assert user_id # nosec
file: File = File(id=file_id, filename=client_file.filename, checksum=None)
abort_link: URL = await storage_client.create_abort_upload_link(
file, query=dict(request.query_params)
)
await abort_upload(abort_upload_link=parse_obj_as(AnyUrl, str(abort_link)))


@router.post(
"/{file_id}:complete",
response_model=File,
include_in_schema=API_SERVER_DEV_FEATURES_ENABLED,
)
@cancel_on_disconnect
async def complete_multipart_upload(
request: Request,
file_id: UUID,
client_file: Annotated[ClientFile, Body(...)],
uploaded_parts: Annotated[FileUploadCompletionBody, Body(...)],
storage_client: Annotated[StorageApi, Depends(get_api_client(StorageApi))],
user_id: Annotated[PositiveInt, Depends(get_current_user_id)],
):
assert request # nosec
assert user_id # nosec

file: File = File(id=file_id, filename=client_file.filename, checksum=None)
complete_link: URL = await storage_client.create_complete_upload_link(
file, dict(request.query_params)
)

e_tag: ETag = await complete_file_upload(
uploaded_parts=uploaded_parts.parts,
upload_completion_link=parse_obj_as(AnyUrl, f"{complete_link}"),
)

file.checksum = e_tag
return file
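
For orientation, the two endpoints above use ``:abort``/``:complete`` custom-method URLs. A client-side sketch of the call sequence — base URL, file id, and body fields are illustrative guesses from the handler signatures, not a documented payload:

```python
import httpx

# Illustrative values only -- not a documented client for this API
BASE = "https://api.example.com/v0/files"
FILE_ID = "11111111-2222-3333-4444-555555555555"

client_file = {"filename": "data.bin", "filesize": 1024}

with httpx.Client() as client:
    # Complete a multipart upload via the ":complete" custom-method URL
    client.post(
        f"{BASE}/{FILE_ID}:complete",
        json={
            "client_file": client_file,
            "uploaded_parts": {"parts": []},  # parts format assumed
        },
    )
    # Or abort it via the ":abort" custom-method URL
    client.post(f"{BASE}/{FILE_ID}:abort", json={"client_file": client_file})
```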


@router.get(
"/{file_id}/content",
response_class=RedirectResponse,
@@ -370,7 +357,9 @@ async def download_file(

# download from S3 using pre-signed link
presigned_download_link = await storage_client.get_download_link(
user_id, file_meta.id, file_meta.filename
user_id=user_id,
file_id=file_meta.id,
file_name=file_meta.filename,
)

_logger.info("Downloading %s to %s ...", file_meta, presigned_download_link)
@@ -32,13 +32,13 @@
from ...models.schemas.solvers import Solver, SolverKeyId
from ...services.catalog import CatalogApi
from ...services.director_v2 import DirectorV2Api, DownloadLink, NodeName
from ...services.storage import StorageApi, to_file_api_model
from ...utils.solver_job_models_converters import (
from ...services.solver_job_models_converters import (
create_job_from_project,
create_jobstatus_from_task,
create_new_project_for_job,
)
from ...utils.solver_job_outputs import ResultsTypes, get_solver_output_results
from ...services.solver_job_outputs import ResultsTypes, get_solver_output_results
from ...services.storage import StorageApi, to_file_api_model
from ..dependencies.application import get_product_name, get_reverse_url_mapper
from ..dependencies.authentication import get_current_user_id
from ..dependencies.database import Engine, get_db_engine
@@ -127,11 +127,7 @@ async def get_jobs_page(
url_for: Annotated[Callable, Depends(get_reverse_url_mapper)],
product_name: Annotated[str, Depends(get_product_name)],
):
"""List of jobs on a specific released solver (includes pagination)
Breaking change in *version 0.5*: response model changed from list[Job] to pagination Page[Job].
"""
"""List of jobs on a specific released solver (includes pagination)"""

# NOTE: Different entry to keep backwards compatibility with list_jobs.
# Eventually use a header with agent version to switch to new interface