diff --git a/2.2.2/404.html b/2.2.2/404.html
deleted file mode 100644
index 9bbb18a..0000000
--- a/2.2.2/404.html
+++ /dev/null
@@ -1,460 +0,0 @@
Projet ROK4 - Librairies python

404 - Not found

\ No newline at end of file
diff --git a/2.2.2/CHANGELOG/index.html b/2.2.2/CHANGELOG/index.html
deleted file mode 100644
index 4446f9a..0000000
--- a/2.2.2/CHANGELOG/index.html
+++ /dev/null
@@ -1,1788 +0,0 @@
Version history - Projet ROK4 - Librairies python

Version history

2.2.2

[Changed]

  • storage module: it can now be used without the GDAL library; only the get_osgeo_path function for S3 paths is then unavailable
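The optional GDAL dependency described above can be sketched with a guarded import; this is an assumed pattern for illustration, not the actual rok4 code:

```python
# Record whether the GDAL bindings are importable instead of failing
# at import time.
try:
    from osgeo import gdal, ogr  # noqa: F401
    GDAL_AVAILABLE = True
except ImportError:
    GDAL_AVAILABLE = False

def require_gdal(feature):
    # Only the features that really need GDAL raise, and only when used.
    if not GDAL_AVAILABLE:
        raise ImportError(feature + " requires the GDAL python bindings")
```

With this pattern, importing the module always succeeds; calling an S3-specific feature without GDAL is what fails.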

2.2.0

[Added]

  • Added the library managing a ROK4 style

2.1.5

[Changed]

  • Pyramid: the function loading the list into memory now returns the number of slabs

2.1.4

[Fixed]

  • Storage: the response to a HEAD request (S3 existence test) carries the code 404, not NoSuchKey (a confusion with object reads)
  • RasterSet: loading a raster set from a file or a descriptor uses the Storage library, not the GDAL library

2.1.3

[Fixed]

  • Storage: when reading or testing the existence of a missing S3 object, the code in the response is not 404 but NoSuchKey

2.1.0

[Added]

  • Pyramid
    • Added functions to get a pyramid's tile_limits and number of channels
    • Added functions to add or remove levels in a pyramid
  • TileMatrixSet
    • Added functions to get the tile height and width of a TileMatrixSet

[Changed]

  • Pyramid
    • Added an optional "mask" parameter to the from_other constructor, to keep or drop the masks of the pyramid the new one is based on
  • Documentation of the different versions is managed with the mike tool

2.0.1

[Added]

  • storage: the read cache is configurable in size (with ROK4_READING_LRU_CACHE_SIZE) and in retention time (with ROK4_READING_LRU_CACHE_TTL)

[Security]

  • Bumped the pillow version (security flaw related to libwebp)

2.0.0

[Fixed]

  • Pyramid
    • Reading a tile from a one-channel PNG pyramid now also returns a 3-dimensional numpy.array (the last dimension is a one-element array)

[Changed]

  • Storage
    • The S3 client keeps connections open
    • The get_data_binary function has an LRU cache, with a validity window of 5 minutes

1.7.1

[Added]

  • Raster
    • RasterSet class, representing a collection of Raster objects, with extra information
    • Methods importing and exporting the information gathered by a RasterSet instance, through a descriptor (JSON file or object, or standard output)
    • Internal documentation
    • Unit tests for the RasterSet class
    • Raster class: constructor from parameters
  • Pyramid
    • Function computing the size of a pyramid
    • Generator reading the content list
  • Storage
    • Function computing the size of the files under a path, depending on the storage type
    • Added HTTP to FILE/S3/CEPH copy
    • Added functions reading an HTTP file, testing its existence and computing its size

[Changed]

  • Raster
    • Code homogenization
    • PEP-8 compliance
  • test_Raster
    • Code homogenization
    • PEP-8 compliance
  • Utils
    • PEP-8 compliance of the compute_bbox and compute_format functions

[Fixed]

  • Utils
    • Fixed a variable name in the compute_format function that shadowed a Python built-in
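The descriptor import/export listed for RasterSet can be sketched as follows; the class shape and field names are a toy model for illustration, not the rok4 descriptor format:

```python
import json
import sys

class RasterSet:
    """Toy model of a raster collection whose gathered information can
    be written as a JSON descriptor, to a file or to standard output."""

    def __init__(self, rasters):
        self.rasters = rasters  # e.g. a list of {"path": ..., "bbox": ...}

    def write_descriptor(self, path=None):
        text = json.dumps({"raster_count": len(self.rasters),
                           "rasters": self.rasters}, indent=2)
        if path is None:
            sys.stdout.write(text + "\n")  # standard output case
        else:
            with open(path, "w") as f:
                f.write(text)

    @classmethod
    def from_descriptor(cls, path):
        with open(path) as f:
            return cls(json.load(f)["rasters"])
```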

1.6.0

Reading through GDAL virtual file systems

[Added]

  • Storage
    • get_osgeo_path function configuring the right virtual file system for the supplied path, and returning the path to use in gdal or ogr Open
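The behaviour of get_osgeo_path can be sketched as a mapping from path prefix to GDAL virtual file system. The handled prefixes are assumptions, and the real function also configures the S3 credentials:

```python
def get_osgeo_path(path):
    # S3 objects go through GDAL's /vsis3/ virtual file system.
    if path.startswith("s3://"):
        return "/vsis3/" + path[len("s3://"):]
    # Local files: strip the scheme, gdal.Open takes plain paths.
    if path.startswith("file://"):
        return path[len("file://"):]
    return path
```

The returned string is what would be passed to gdal.Open or ogr.Open.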

[Changed]

  • Storage
    • Getting an S3 client (__get_s3_client) returns the client, the host, the access and secret keys, and the bucket name stripped of the optional cluster host

[Fixed]

  • Storage
    • S3 binary read: wrong bucket and object name configuration, and wrong partial read

[Removed]

  • Exceptions
    • NotImplementedError is a native exception

1.5.0

[Added]

  • Level
    • Tile test function is_in_limits: are its indices within the level's limits?
  • Pyramid
    • Reading a tile first checks that the indices are within the level's limits
    • Exceptions raised while decoding a raster tile emit a FormatError
    • get_tile_indices accepts a coordinate system as input: it is the system of the supplied coordinates, and allows a reprojection when it differs from the system of the pyramid's data
  • Utils
    • Better reprojection handling in reproject_bbox: identical input systems, or systems differing only in axis order, are detected to avoid the computation
    • Added the point reprojection function reproject_point: identical input systems, or systems differing only in axis order, are detected to avoid the computation
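The short-circuit logic of reproject_point and reproject_bbox can be sketched as follows. A coordinate system is modelled here as a (datum, axis order) pair, a toy stand-in for the OSR SpatialReference objects the real Utils functions work on:

```python
def reproject_point(x, y, srs_src, srs_dst):
    # Identical systems: nothing to compute.
    if srs_src == srs_dst:
        return x, y
    # Same underlying system, axes swapped: just exchange coordinates.
    if srs_src[0] == srs_dst[0]:
        return y, x
    # The general case needs a real CRS library (osgeo.osr in rok4).
    raise NotImplementedError("full reprojection not sketched here")
```

The point of the change is exactly these two early returns: the expensive transformation is skipped whenever no real reprojection is needed.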

[Changed]

  • Utils
    • bbox_to_geometry: no coordinate system is supplied anymore; the function simply creates the OGR geometry from the bbox, optionally densifying the edges with points
  • Pyramid
    • Function renamed: update_limits -> set_limits_from_bbox, to make its behaviour explicit (the limits are overwritten, not merged with the supplied bbox)

1.4.4

Added pyramid data reading features; project management now follows the PyPA recommendations.

[Added]

  • TileMatrix
    • Function computing the tile indices, and the pixel indices within the tile, from a point in the TMS coordinate system
  • Pyramid
    • Function computing the tile indices, and the pixel indices within the tile, from a point in the TMS coordinate system and, optionally, a level
    • Tile reading functions: as source binary, or as a 3-dimensional array for raster tiles
  • Storage
    • Binary read function, full or partial, of a file or of an S3 or CEPH object
  • Exceptions: NotImplementedError states that a feature is not implemented for every case. Here, raster data decompression is not handled for the packbits and LZW compressions
  • Added PyPI publication to the GitHub CI

[Changed]

  • Storage
    • String reads rely on the full binary read. No change in usage.
  • TileMatrixSet: whatever the coordinate system, only the X,Y or Lon,Lat axis order is handled. However, functions computing bboxes, or computing from bboxes, respect the system's axis order in those bboxes
  • Moved the project configuration into the pyproject.toml file
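The tile/pixel index computation described for TileMatrix and Pyramid can be sketched as follows. Argument names are illustrative, not the actual rok4 signature; the origin is taken at the top-left corner, so y decreases as rows increase:

```python
from math import floor

def get_tile_indices(x, y, origin_x, origin_y, resolution, tile_width, tile_height):
    # Absolute pixel indices of the point, counted from the level origin.
    abs_col = floor((x - origin_x) / resolution)
    abs_row = floor((origin_y - y) / resolution)
    # Tile indices, and pixel indices within that tile.
    tile_col, pixel_col = divmod(abs_col, tile_width)
    tile_row, pixel_row = divmod(abs_row, tile_height)
    return tile_col, tile_row, pixel_col, pixel_row
```

For example, with a 1 m resolution, 256×256 tiles and the origin at (0, 1000), the point (300, 700) lands in tile (1, 1) at pixel (44, 44).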

1.3.0

Added the vector data reading library, unit tests and storage features. Improved project management and continuous integration.

[Added]

  • Vector data reading library
    • Loading vector data from shapefile, GeoPackage, CSV and GeoJSON files
    • Unit tests written
  • Pyramid library: completed the unit tests
  • Storage library: support for the CEPH -> S3 copy
  • Project management (builds, dependencies...) via poetry
  • Version injection into pyproject.toml and __init__.py (definition of the __version__ variable)
  • GitHub CI evolution
    • Installations and unit tests checked on ubuntu 20.04 / python 3.8 and ubuntu 22.04 / python 3.10
    • Publication of the artifact holding the unit test results
    • Release cleanup on error
    • Documentation build and publication on the gh-pages branch
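For the GeoJSON case, the vector loading can be sketched with the stdlib alone; the rok4 Vector library goes through OGR and also handles shapefile, GeoPackage and CSV:

```python
import json

def load_geojson_features(text):
    # Parse a GeoJSON FeatureCollection and return (geometry type,
    # properties) pairs; a minimal stand-in for the OGR-based loader.
    data = json.loads(text)
    if data.get("type") != "FeatureCollection":
        raise ValueError("expected a GeoJSON FeatureCollection")
    return [(f["geometry"]["type"], f.get("properties") or {})
            for f in data["features"]]
```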

1.2.0

Added the libraries for the make-layer.py utility.

[Added]

  • Storage library: completed the unit tests
  • Pyramid library
    • Added getters for the top and bottom levels
  • Added the Layer library, managing a layer
    • Loading a layer from parameters
    • Loading a layer from a descriptor
    • Writing the descriptor in the format expected by the server
    • Unit tests written
  • Added a Utils utility library
    • Conversion of an SRS into an OSR SpatialReference object
    • Conversion of a bbox into an OGR Geometry object
    • Reprojection of a bbox, with edge densification and partial reprojection
    • Unit tests written
  • Configured the coverage tool to see the unit test coverage

1.1.0

Support for several S3 storage clusters.

[Added]

  • Storage abstraction library
    • Support for several S3 clusters. The environment variables for S3 storage hold several comma-separated values, and bucket names can be suffixed with "@{S3 cluster host}". By default, the first defined cluster is used. The cluster host is never written in the pyramid descriptor or in the list file (as they are stored on the cluster, we know which one holds the objects). Symbolic objects do not state it either, and can only link within a single S3 cluster
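The bucket-suffix scheme described here can be sketched with a hypothetical helper (not the rok4 API); the comma-separated environment value is parsed once, and unsuffixed bucket names fall back to the first cluster:

```python
def parse_cluster_hosts(env_value):
    # One comma-separated entry per S3 cluster, as in the changelog entry.
    return [h.strip() for h in env_value.split(",")]

def parse_bucket_name(name, cluster_hosts):
    # An optional "@{S3 cluster host}" suffix selects a cluster;
    # without it, the first defined cluster is used.
    if "@" in name:
        bucket, host = name.rsplit("@", 1)
        if host not in cluster_hosts:
            raise ValueError("unknown S3 cluster host: " + host)
        return bucket, host
    return name, cluster_hosts[0]
```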

1.0.0

Initialization of the Python libraries used by the upcoming Python tools of the pytools repository.

[Added]

  • Storage abstraction library (S3, CEPH or FILE)
    • Reading content as a string
    • Writing string content
    • Creating a symbolic link
    • file/object <-> file/object copy
  • Tile Matrix Set loading library
  • Pyramid descriptor management library
    • Loading from a descriptor or by cloning (with a storage change)
    • Writing the descriptor
  • Unit tests covering these libraries
\ No newline at end of file
diff --git a/2.2.2/CONTRIBUTING/index.html b/2.2.2/CONTRIBUTING/index.html
deleted file mode 100644
index 4b2debf..0000000
--- a/2.2.2/CONTRIBUTING/index.html
+++ /dev/null
@@ -1,613 +0,0 @@
Contributing - Projet ROK4 - Librairies python

Contribution guidelines

Thank you for considering contributing to this project!

Git hooks

We use git hooks via pre-commit to automatically apply and check a number of rules. Please install it before pushing a commit.

See the corresponding configuration file: .pre-commit-config.yaml.


Pull request

The PR title is used to automatically build the release notes. In a comment on your PR, you can give details that the project maintainers will add to the CHANGELOG.md file.

The changelog format is the following, in markdown:

### [Added]

List of new features.

### [Changed]

List of modified existing features.

### [Deprecated]

List of deprecated features.

### [Removed]

List of removed features.

### [Fixed]

List of functional fixes.

### [Security]

List of security fixes.

Empty sections, with no items to list, can be omitted.

\ No newline at end of file
diff --git a/2.2.2/assets/images/favicon.png b/2.2.2/assets/images/favicon.png
deleted file mode 100644
index 1cf13b9..0000000
Binary files a/2.2.2/assets/images/favicon.png and /dev/null differ
diff --git a/2.2.2/assets/javascripts/bundle.525ec568.min.js b/2.2.2/assets/javascripts/bundle.525ec568.min.js
deleted file mode 100644
index 4b08eae..0000000
--- a/2.2.2/assets/javascripts/bundle.525ec568.min.js
+++ /dev/null
@@ -1,16 +0,0 @@
[16 deleted lines of minified JavaScript bundle]
qe=u=u!=null?u:r();ft.add(function(){d--,d===0&&!L&&!y&&(f=Ur(J,p))}),qe.subscribe(ft),!l&&d>0&&(l=new at({next:function(Fe){return qe.next(Fe)},error:function(Fe){L=!0,X(),f=Ur(te,n,Fe),qe.error(Fe)},complete:function(){y=!0,X(),f=Ur(te,a),qe.complete()}}),U(k).subscribe(l))})(c)}}function Ur(e,t){for(var r=[],o=2;oe.next(document)),e}function P(e,t=document){return Array.from(t.querySelectorAll(e))}function R(e,t=document){let r=fe(e,t);if(typeof r=="undefined")throw new ReferenceError(`Missing element: expected "${e}" to be present`);return r}function fe(e,t=document){return t.querySelector(e)||void 0}function Ie(){var e,t,r,o;return(o=(r=(t=(e=document.activeElement)==null?void 0:e.shadowRoot)==null?void 0:t.activeElement)!=null?r:document.activeElement)!=null?o:void 0}var wa=O(h(document.body,"focusin"),h(document.body,"focusout")).pipe(_e(1),Q(void 0),m(()=>Ie()||document.body),G(1));function et(e){return wa.pipe(m(t=>e.contains(t)),K())}function $t(e,t){return C(()=>O(h(e,"mouseenter").pipe(m(()=>!0)),h(e,"mouseleave").pipe(m(()=>!1))).pipe(t?Ht(r=>Le(+!r*t)):le,Q(e.matches(":hover"))))}function Jo(e,t){if(typeof t=="string"||typeof t=="number")e.innerHTML+=t.toString();else if(t instanceof Node)e.appendChild(t);else if(Array.isArray(t))for(let r of t)Jo(e,r)}function x(e,t,...r){let o=document.createElement(e);if(t)for(let n of Object.keys(t))typeof t[n]!="undefined"&&(typeof t[n]!="boolean"?o.setAttribute(n,t[n]):o.setAttribute(n,""));for(let n of r)Jo(o,n);return o}function sr(e){if(e>999){let t=+((e-950)%1e3>99);return`${((e+1e-6)/1e3).toFixed(t)}k`}else return e.toString()}function Tt(e){let t=x("script",{src:e});return C(()=>(document.head.appendChild(t),O(h(t,"load"),h(t,"error").pipe(v(()=>$r(()=>new ReferenceError(`Invalid script: ${e}`))))).pipe(m(()=>{}),_(()=>document.head.removeChild(t)),Te(1))))}var Xo=new g,Ta=C(()=>typeof ResizeObserver=="undefined"?Tt("https://unpkg.com/resize-observer-polyfill"):I(void 0)).pipe(m(()=>new 
ResizeObserver(e=>e.forEach(t=>Xo.next(t)))),v(e=>O(Ye,I(e)).pipe(_(()=>e.disconnect()))),G(1));function ce(e){return{width:e.offsetWidth,height:e.offsetHeight}}function ge(e){let t=e;for(;t.clientWidth===0&&t.parentElement;)t=t.parentElement;return Ta.pipe(w(r=>r.observe(t)),v(r=>Xo.pipe(b(o=>o.target===t),_(()=>r.unobserve(t)))),m(()=>ce(e)),Q(ce(e)))}function St(e){return{width:e.scrollWidth,height:e.scrollHeight}}function cr(e){let t=e.parentElement;for(;t&&(e.scrollWidth<=t.scrollWidth&&e.scrollHeight<=t.scrollHeight);)t=(e=t).parentElement;return t?e:void 0}function Zo(e){let t=[],r=e.parentElement;for(;r;)(e.clientWidth>r.clientWidth||e.clientHeight>r.clientHeight)&&t.push(r),r=(e=r).parentElement;return t.length===0&&t.push(document.documentElement),t}function Ve(e){return{x:e.offsetLeft,y:e.offsetTop}}function en(e){let t=e.getBoundingClientRect();return{x:t.x+window.scrollX,y:t.y+window.scrollY}}function tn(e){return O(h(window,"load"),h(window,"resize")).pipe(Me(0,me),m(()=>Ve(e)),Q(Ve(e)))}function pr(e){return{x:e.scrollLeft,y:e.scrollTop}}function Ne(e){return O(h(e,"scroll"),h(window,"scroll"),h(window,"resize")).pipe(Me(0,me),m(()=>pr(e)),Q(pr(e)))}var rn=new g,Sa=C(()=>I(new IntersectionObserver(e=>{for(let t of e)rn.next(t)},{threshold:0}))).pipe(v(e=>O(Ye,I(e)).pipe(_(()=>e.disconnect()))),G(1));function tt(e){return Sa.pipe(w(t=>t.observe(e)),v(t=>rn.pipe(b(({target:r})=>r===e),_(()=>t.unobserve(e)),m(({isIntersecting:r})=>r))))}function on(e,t=16){return Ne(e).pipe(m(({y:r})=>{let o=ce(e),n=St(e);return r>=n.height-o.height-t}),K())}var lr={drawer:R("[data-md-toggle=drawer]"),search:R("[data-md-toggle=search]")};function nn(e){return lr[e].checked}function Je(e,t){lr[e].checked!==t&&lr[e].click()}function ze(e){let t=lr[e];return h(t,"change").pipe(m(()=>t.checked),Q(t.checked))}function Oa(e,t){switch(e.constructor){case HTMLInputElement:return e.type==="radio"?/^Arrow/.test(t):!0;case HTMLSelectElement:case 
HTMLTextAreaElement:return!0;default:return e.isContentEditable}}function La(){return O(h(window,"compositionstart").pipe(m(()=>!0)),h(window,"compositionend").pipe(m(()=>!1))).pipe(Q(!1))}function an(){let e=h(window,"keydown").pipe(b(t=>!(t.metaKey||t.ctrlKey)),m(t=>({mode:nn("search")?"search":"global",type:t.key,claim(){t.preventDefault(),t.stopPropagation()}})),b(({mode:t,type:r})=>{if(t==="global"){let o=Ie();if(typeof o!="undefined")return!Oa(o,r)}return!0}),pe());return La().pipe(v(t=>t?S:e))}function ye(){return new URL(location.href)}function lt(e,t=!1){if(B("navigation.instant")&&!t){let r=x("a",{href:e.href});document.body.appendChild(r),r.click(),r.remove()}else location.href=e.href}function sn(){return new g}function cn(){return location.hash.slice(1)}function pn(e){let t=x("a",{href:e});t.addEventListener("click",r=>r.stopPropagation()),t.click()}function Ma(e){return O(h(window,"hashchange"),e).pipe(m(cn),Q(cn()),b(t=>t.length>0),G(1))}function ln(e){return Ma(e).pipe(m(t=>fe(`[id="${t}"]`)),b(t=>typeof t!="undefined"))}function Pt(e){let t=matchMedia(e);return ar(r=>t.addListener(()=>r(t.matches))).pipe(Q(t.matches))}function mn(){let e=matchMedia("print");return O(h(window,"beforeprint").pipe(m(()=>!0)),h(window,"afterprint").pipe(m(()=>!1))).pipe(Q(e.matches))}function Nr(e,t){return e.pipe(v(r=>r?t():S))}function zr(e,t){return new j(r=>{let o=new XMLHttpRequest;return o.open("GET",`${e}`),o.responseType="blob",o.addEventListener("load",()=>{o.status>=200&&o.status<300?(r.next(o.response),r.complete()):r.error(new Error(o.statusText))}),o.addEventListener("error",()=>{r.error(new Error("Network error"))}),o.addEventListener("abort",()=>{r.complete()}),typeof(t==null?void 0:t.progress$)!="undefined"&&(o.addEventListener("progress",n=>{var i;if(n.lengthComputable)t.progress$.next(n.loaded/n.total*100);else{let 
a=(i=o.getResponseHeader("Content-Length"))!=null?i:0;t.progress$.next(n.loaded/+a*100)}}),t.progress$.next(5)),o.send(),()=>o.abort()})}function je(e,t){return zr(e,t).pipe(v(r=>r.text()),m(r=>JSON.parse(r)),G(1))}function fn(e,t){let r=new DOMParser;return zr(e,t).pipe(v(o=>o.text()),m(o=>r.parseFromString(o,"text/html")),G(1))}function un(e,t){let r=new DOMParser;return zr(e,t).pipe(v(o=>o.text()),m(o=>r.parseFromString(o,"text/xml")),G(1))}function dn(){return{x:Math.max(0,scrollX),y:Math.max(0,scrollY)}}function hn(){return O(h(window,"scroll",{passive:!0}),h(window,"resize",{passive:!0})).pipe(m(dn),Q(dn()))}function bn(){return{width:innerWidth,height:innerHeight}}function vn(){return h(window,"resize",{passive:!0}).pipe(m(bn),Q(bn()))}function gn(){return z([hn(),vn()]).pipe(m(([e,t])=>({offset:e,size:t})),G(1))}function mr(e,{viewport$:t,header$:r}){let o=t.pipe(ee("size")),n=z([o,r]).pipe(m(()=>Ve(e)));return z([r,t,n]).pipe(m(([{height:i},{offset:a,size:s},{x:p,y:c}])=>({offset:{x:a.x-p,y:a.y-c+i},size:s})))}function _a(e){return h(e,"message",t=>t.data)}function Aa(e){let t=new g;return t.subscribe(r=>e.postMessage(r)),t}function yn(e,t=new Worker(e)){let r=_a(t),o=Aa(t),n=new g;n.subscribe(o);let i=o.pipe(Z(),ie(!0));return n.pipe(Z(),Re(r.pipe(W(i))),pe())}var Ca=R("#__config"),Ot=JSON.parse(Ca.textContent);Ot.base=`${new URL(Ot.base,ye())}`;function xe(){return Ot}function B(e){return Ot.features.includes(e)}function Ee(e,t){return typeof t!="undefined"?Ot.translations[e].replace("#",t.toString()):Ot.translations[e]}function Se(e,t=document){return R(`[data-md-component=${e}]`,t)}function ae(e,t=document){return P(`[data-md-component=${e}]`,t)}function ka(e){let t=R(".md-typeset > :first-child",e);return h(t,"click",{once:!0}).pipe(m(()=>R(".md-typeset",e)),m(r=>({hash:__md_hash(r.innerHTML)})))}function xn(e){if(!B("announce.dismiss")||!e.childElementCount)return S;if(!e.hidden){let 
t=R(".md-typeset",e);__md_hash(t.innerHTML)===__md_get("__announce")&&(e.hidden=!0)}return C(()=>{let t=new g;return t.subscribe(({hash:r})=>{e.hidden=!0,__md_set("__announce",r)}),ka(e).pipe(w(r=>t.next(r)),_(()=>t.complete()),m(r=>$({ref:e},r)))})}function Ha(e,{target$:t}){return t.pipe(m(r=>({hidden:r!==e})))}function En(e,t){let r=new g;return r.subscribe(({hidden:o})=>{e.hidden=o}),Ha(e,t).pipe(w(o=>r.next(o)),_(()=>r.complete()),m(o=>$({ref:e},o)))}function Rt(e,t){return t==="inline"?x("div",{class:"md-tooltip md-tooltip--inline",id:e,role:"tooltip"},x("div",{class:"md-tooltip__inner md-typeset"})):x("div",{class:"md-tooltip",id:e,role:"tooltip"},x("div",{class:"md-tooltip__inner md-typeset"}))}function wn(...e){return x("div",{class:"md-tooltip2",role:"tooltip"},x("div",{class:"md-tooltip2__inner md-typeset"},e))}function Tn(e,t){if(t=t?`${t}_annotation_${e}`:void 0,t){let r=t?`#${t}`:void 0;return x("aside",{class:"md-annotation",tabIndex:0},Rt(t),x("a",{href:r,class:"md-annotation__index",tabIndex:-1},x("span",{"data-md-annotation-id":e})))}else return x("aside",{class:"md-annotation",tabIndex:0},Rt(t),x("span",{class:"md-annotation__index",tabIndex:-1},x("span",{"data-md-annotation-id":e})))}function Sn(e){return x("button",{class:"md-clipboard md-icon",title:Ee("clipboard.copy"),"data-clipboard-target":`#${e} > code`})}var Ln=Mt(qr());function Qr(e,t){let r=t&2,o=t&1,n=Object.keys(e.terms).filter(p=>!e.terms[p]).reduce((p,c)=>[...p,x("del",null,(0,Ln.default)(c))," "],[]).slice(0,-1),i=xe(),a=new URL(e.location,i.base);B("search.highlight")&&a.searchParams.set("h",Object.entries(e.terms).filter(([,p])=>p).reduce((p,[c])=>`${p} ${c}`.trim(),""));let{tags:s}=xe();return x("a",{href:`${a}`,class:"md-search-result__link",tabIndex:-1},x("article",{class:"md-search-result__article md-typeset","data-md-score":e.score.toFixed(2)},r>0&&x("div",{class:"md-search-result__icon 
md-icon"}),r>0&&x("h1",null,e.title),r<=0&&x("h2",null,e.title),o>0&&e.text.length>0&&e.text,e.tags&&x("nav",{class:"md-tags"},e.tags.map(p=>{let c=s?p in s?`md-tag-icon md-tag--${s[p]}`:"md-tag-icon":"";return x("span",{class:`md-tag ${c}`},p)})),o>0&&n.length>0&&x("p",{class:"md-search-result__terms"},Ee("search.result.term.missing"),": ",...n)))}function Mn(e){let t=e[0].score,r=[...e],o=xe(),n=r.findIndex(l=>!`${new URL(l.location,o.base)}`.includes("#")),[i]=r.splice(n,1),a=r.findIndex(l=>l.scoreQr(l,1)),...p.length?[x("details",{class:"md-search-result__more"},x("summary",{tabIndex:-1},x("div",null,p.length>0&&p.length===1?Ee("search.result.more.one"):Ee("search.result.more.other",p.length))),...p.map(l=>Qr(l,1)))]:[]];return x("li",{class:"md-search-result__item"},c)}function _n(e){return x("ul",{class:"md-source__facts"},Object.entries(e).map(([t,r])=>x("li",{class:`md-source__fact md-source__fact--${t}`},typeof r=="number"?sr(r):r)))}function Kr(e){let t=`tabbed-control tabbed-control--${e}`;return x("div",{class:t,hidden:!0},x("button",{class:"tabbed-button",tabIndex:-1,"aria-hidden":"true"}))}function An(e){return x("div",{class:"md-typeset__scrollwrap"},x("div",{class:"md-typeset__table"},e))}function Ra(e){var o;let t=xe(),r=new URL(`../${e.version}/`,t.base);return x("li",{class:"md-version__item"},x("a",{href:`${r}`,class:"md-version__link"},e.title,((o=t.version)==null?void 0:o.alias)&&e.aliases.length>0&&x("span",{class:"md-version__alias"},e.aliases[0])))}function Cn(e,t){var o;let r=xe();return e=e.filter(n=>{var i;return!((i=n.properties)!=null&&i.hidden)}),x("div",{class:"md-version"},x("button",{class:"md-version__current","aria-label":Ee("select.version")},t.title,((o=r.version)==null?void 0:o.alias)&&t.aliases.length>0&&x("span",{class:"md-version__alias"},t.aliases[0])),x("ul",{class:"md-version__list"},e.map(Ra)))}var Ia=0;function ja(e){let 
t=z([et(e),$t(e)]).pipe(m(([o,n])=>o||n),K()),r=C(()=>Zo(e)).pipe(ne(Ne),pt(1),He(t),m(()=>en(e)));return t.pipe(Ae(o=>o),v(()=>z([t,r])),m(([o,n])=>({active:o,offset:n})),pe())}function Fa(e,t){let{content$:r,viewport$:o}=t,n=`__tooltip2_${Ia++}`;return C(()=>{let i=new g,a=new _r(!1);i.pipe(Z(),ie(!1)).subscribe(a);let s=a.pipe(Ht(c=>Le(+!c*250,kr)),K(),v(c=>c?r:S),w(c=>c.id=n),pe());z([i.pipe(m(({active:c})=>c)),s.pipe(v(c=>$t(c,250)),Q(!1))]).pipe(m(c=>c.some(l=>l))).subscribe(a);let p=a.pipe(b(c=>c),re(s,o),m(([c,l,{size:f}])=>{let u=e.getBoundingClientRect(),d=u.width/2;if(l.role==="tooltip")return{x:d,y:8+u.height};if(u.y>=f.height/2){let{height:y}=ce(l);return{x:d,y:-16-y}}else return{x:d,y:16+u.height}}));return z([s,i,p]).subscribe(([c,{offset:l},f])=>{c.style.setProperty("--md-tooltip-host-x",`${l.x}px`),c.style.setProperty("--md-tooltip-host-y",`${l.y}px`),c.style.setProperty("--md-tooltip-x",`${f.x}px`),c.style.setProperty("--md-tooltip-y",`${f.y}px`),c.classList.toggle("md-tooltip2--top",f.y<0),c.classList.toggle("md-tooltip2--bottom",f.y>=0)}),a.pipe(b(c=>c),re(s,(c,l)=>l),b(c=>c.role==="tooltip")).subscribe(c=>{let l=ce(R(":scope > *",c));c.style.setProperty("--md-tooltip-width",`${l.width}px`),c.style.setProperty("--md-tooltip-tail","0px")}),a.pipe(K(),ve(me),re(s)).subscribe(([c,l])=>{l.classList.toggle("md-tooltip2--active",c)}),z([a.pipe(b(c=>c)),s]).subscribe(([c,l])=>{l.role==="dialog"?(e.setAttribute("aria-controls",n),e.setAttribute("aria-haspopup","dialog")):e.setAttribute("aria-describedby",n)}),a.pipe(b(c=>!c)).subscribe(()=>{e.removeAttribute("aria-controls"),e.removeAttribute("aria-describedby"),e.removeAttribute("aria-haspopup")}),ja(e).pipe(w(c=>i.next(c)),_(()=>i.complete()),m(c=>$({ref:e},c)))})}function mt(e,{viewport$:t},r=document.body){return Fa(e,{content$:new j(o=>{let n=e.title,i=wn(n);return o.next(i),e.removeAttribute("title"),r.append(i),()=>{i.remove(),e.setAttribute("title",n)}}),viewport$:t})}function Ua(e,t){let 
r=C(()=>z([tn(e),Ne(t)])).pipe(m(([{x:o,y:n},i])=>{let{width:a,height:s}=ce(e);return{x:o-i.x+a/2,y:n-i.y+s/2}}));return et(e).pipe(v(o=>r.pipe(m(n=>({active:o,offset:n})),Te(+!o||1/0))))}function kn(e,t,{target$:r}){let[o,n]=Array.from(e.children);return C(()=>{let i=new g,a=i.pipe(Z(),ie(!0));return i.subscribe({next({offset:s}){e.style.setProperty("--md-tooltip-x",`${s.x}px`),e.style.setProperty("--md-tooltip-y",`${s.y}px`)},complete(){e.style.removeProperty("--md-tooltip-x"),e.style.removeProperty("--md-tooltip-y")}}),tt(e).pipe(W(a)).subscribe(s=>{e.toggleAttribute("data-md-visible",s)}),O(i.pipe(b(({active:s})=>s)),i.pipe(_e(250),b(({active:s})=>!s))).subscribe({next({active:s}){s?e.prepend(o):o.remove()},complete(){e.prepend(o)}}),i.pipe(Me(16,me)).subscribe(({active:s})=>{o.classList.toggle("md-tooltip--active",s)}),i.pipe(pt(125,me),b(()=>!!e.offsetParent),m(()=>e.offsetParent.getBoundingClientRect()),m(({x:s})=>s)).subscribe({next(s){s?e.style.setProperty("--md-tooltip-0",`${-s}px`):e.style.removeProperty("--md-tooltip-0")},complete(){e.style.removeProperty("--md-tooltip-0")}}),h(n,"click").pipe(W(a),b(s=>!(s.metaKey||s.ctrlKey))).subscribe(s=>{s.stopPropagation(),s.preventDefault()}),h(n,"mousedown").pipe(W(a),re(i)).subscribe(([s,{active:p}])=>{var c;if(s.button!==0||s.metaKey||s.ctrlKey)s.preventDefault();else if(p){s.preventDefault();let l=e.parentElement.closest(".md-annotation");l instanceof HTMLElement?l.focus():(c=Ie())==null||c.blur()}}),r.pipe(W(a),b(s=>s===o),Ge(125)).subscribe(()=>e.focus()),Ua(e,t).pipe(w(s=>i.next(s)),_(()=>i.complete()),m(s=>$({ref:e},s)))})}function Wa(e){return e.tagName==="CODE"?P(".c, .c1, .cm",e):[e]}function Da(e){let t=[];for(let r of Wa(e)){let o=[],n=document.createNodeIterator(r,NodeFilter.SHOW_TEXT);for(let i=n.nextNode();i;i=n.nextNode())o.push(i);for(let i of o){let a;for(;a=/(\(\d+\))(!)?/.exec(i.textContent);){let[,s,p]=a;if(typeof p=="undefined"){let 
c=i.splitText(a.index);i=c.splitText(s.length),t.push(c)}else{i.textContent=s,t.push(i);break}}}}return t}function Hn(e,t){t.append(...Array.from(e.childNodes))}function fr(e,t,{target$:r,print$:o}){let n=t.closest("[id]"),i=n==null?void 0:n.id,a=new Map;for(let s of Da(t)){let[,p]=s.textContent.match(/\((\d+)\)/);fe(`:scope > li:nth-child(${p})`,e)&&(a.set(p,Tn(p,i)),s.replaceWith(a.get(p)))}return a.size===0?S:C(()=>{let s=new g,p=s.pipe(Z(),ie(!0)),c=[];for(let[l,f]of a)c.push([R(".md-typeset",f),R(`:scope > li:nth-child(${l})`,e)]);return o.pipe(W(p)).subscribe(l=>{e.hidden=!l,e.classList.toggle("md-annotation-list",l);for(let[f,u]of c)l?Hn(f,u):Hn(u,f)}),O(...[...a].map(([,l])=>kn(l,t,{target$:r}))).pipe(_(()=>s.complete()),pe())})}function $n(e){if(e.nextElementSibling){let t=e.nextElementSibling;if(t.tagName==="OL")return t;if(t.tagName==="P"&&!t.children.length)return $n(t)}}function Pn(e,t){return C(()=>{let r=$n(e);return typeof r!="undefined"?fr(r,e,t):S})}var Rn=Mt(Br());var Va=0;function In(e){if(e.nextElementSibling){let t=e.nextElementSibling;if(t.tagName==="OL")return t;if(t.tagName==="P"&&!t.children.length)return In(t)}}function Na(e){return ge(e).pipe(m(({width:t})=>({scrollable:St(e).width>t})),ee("scrollable"))}function jn(e,t){let{matches:r}=matchMedia("(hover)"),o=C(()=>{let n=new g,i=n.pipe(jr(1));n.subscribe(({scrollable:c})=>{c&&r?e.setAttribute("tabindex","0"):e.removeAttribute("tabindex")});let a=[];if(Rn.default.isSupported()&&(e.closest(".copy")||B("content.code.copy")&&!e.closest(".no-copy"))){let c=e.closest("pre");c.id=`__code_${Va++}`;let l=Sn(c.id);c.insertBefore(l,e),B("content.tooltips")&&a.push(mt(l,{viewport$}))}let s=e.closest(".highlight");if(s instanceof HTMLElement){let c=In(s);if(typeof c!="undefined"&&(s.classList.contains("annotate")||B("content.code.annotate"))){let l=fr(c,e,t);a.push(ge(s).pipe(W(i),m(({width:f,height:u})=>f&&u),K(),v(f=>f?l:S)))}}return P(":scope > 
span[id]",e).length&&e.classList.add("md-code__content"),Na(e).pipe(w(c=>n.next(c)),_(()=>n.complete()),m(c=>$({ref:e},c)),Re(...a))});return B("content.lazy")?tt(e).pipe(b(n=>n),Te(1),v(()=>o)):o}function za(e,{target$:t,print$:r}){let o=!0;return O(t.pipe(m(n=>n.closest("details:not([open])")),b(n=>e===n),m(()=>({action:"open",reveal:!0}))),r.pipe(b(n=>n||!o),w(()=>o=e.open),m(n=>({action:n?"open":"close"}))))}function Fn(e,t){return C(()=>{let r=new g;return r.subscribe(({action:o,reveal:n})=>{e.toggleAttribute("open",o==="open"),n&&e.scrollIntoView()}),za(e,t).pipe(w(o=>r.next(o)),_(()=>r.complete()),m(o=>$({ref:e},o)))})}var Un=".node circle,.node ellipse,.node path,.node polygon,.node rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}marker{fill:var(--md-mermaid-edge-color)!important}.edgeLabel .label rect{fill:#0000}.label{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.label foreignObject{line-height:normal;overflow:visible}.label div .edgeLabel{color:var(--md-mermaid-label-fg-color)}.edgeLabel,.edgeLabel p,.label div .edgeLabel{background-color:var(--md-mermaid-label-bg-color)}.edgeLabel,.edgeLabel p{fill:var(--md-mermaid-label-bg-color);color:var(--md-mermaid-edge-color)}.edgePath .path,.flowchart-link{stroke:var(--md-mermaid-edge-color);stroke-width:.05rem}.edgePath .arrowheadPath{fill:var(--md-mermaid-edge-color);stroke:none}.cluster rect{fill:var(--md-default-fg-color--lightest);stroke:var(--md-default-fg-color--lighter)}.cluster span{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}g #flowchart-circleEnd,g #flowchart-circleStart,g #flowchart-crossEnd,g #flowchart-crossStart,g #flowchart-pointEnd,g #flowchart-pointStart{stroke:none}g.classGroup line,g.classGroup rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}g.classGroup text{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.classLabel 
.box{fill:var(--md-mermaid-label-bg-color);background-color:var(--md-mermaid-label-bg-color);opacity:1}.classLabel .label{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.node .divider{stroke:var(--md-mermaid-node-fg-color)}.relation{stroke:var(--md-mermaid-edge-color)}.cardinality{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.cardinality text{fill:inherit!important}defs #classDiagram-compositionEnd,defs #classDiagram-compositionStart,defs #classDiagram-dependencyEnd,defs #classDiagram-dependencyStart,defs #classDiagram-extensionEnd,defs #classDiagram-extensionStart{fill:var(--md-mermaid-edge-color)!important;stroke:var(--md-mermaid-edge-color)!important}defs #classDiagram-aggregationEnd,defs #classDiagram-aggregationStart{fill:var(--md-mermaid-label-bg-color)!important;stroke:var(--md-mermaid-edge-color)!important}g.stateGroup rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}g.stateGroup .state-title{fill:var(--md-mermaid-label-fg-color)!important;font-family:var(--md-mermaid-font-family)}g.stateGroup .composit{fill:var(--md-mermaid-label-bg-color)}.nodeLabel,.nodeLabel p{color:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}a .nodeLabel{text-decoration:underline}.node circle.state-end,.node circle.state-start,.start-state{fill:var(--md-mermaid-edge-color);stroke:none}.end-state-inner,.end-state-outer{fill:var(--md-mermaid-edge-color)}.end-state-inner,.node circle.state-end{stroke:var(--md-mermaid-label-bg-color)}.transition{stroke:var(--md-mermaid-edge-color)}[id^=state-fork] rect,[id^=state-join] rect{fill:var(--md-mermaid-edge-color)!important;stroke:none!important}.statediagram-cluster.statediagram-cluster .inner{fill:var(--md-default-bg-color)}.statediagram-cluster rect{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}.statediagram-state 
rect.divider{fill:var(--md-default-fg-color--lightest);stroke:var(--md-default-fg-color--lighter)}defs #statediagram-barbEnd{stroke:var(--md-mermaid-edge-color)}.attributeBoxEven,.attributeBoxOdd{fill:var(--md-mermaid-node-bg-color);stroke:var(--md-mermaid-node-fg-color)}.entityBox{fill:var(--md-mermaid-label-bg-color);stroke:var(--md-mermaid-node-fg-color)}.entityLabel{fill:var(--md-mermaid-label-fg-color);font-family:var(--md-mermaid-font-family)}.relationshipLabelBox{fill:var(--md-mermaid-label-bg-color);fill-opacity:1;background-color:var(--md-mermaid-label-bg-color);opacity:1}.relationshipLabel{fill:var(--md-mermaid-label-fg-color)}.relationshipLine{stroke:var(--md-mermaid-edge-color)}defs #ONE_OR_MORE_END *,defs #ONE_OR_MORE_START *,defs #ONLY_ONE_END *,defs #ONLY_ONE_START *,defs #ZERO_OR_MORE_END *,defs #ZERO_OR_MORE_START *,defs #ZERO_OR_ONE_END *,defs #ZERO_OR_ONE_START *{stroke:var(--md-mermaid-edge-color)!important}defs #ZERO_OR_MORE_END circle,defs #ZERO_OR_MORE_START circle{fill:var(--md-mermaid-label-bg-color)}.actor{fill:var(--md-mermaid-sequence-actor-bg-color);stroke:var(--md-mermaid-sequence-actor-border-color)}text.actor>tspan{fill:var(--md-mermaid-sequence-actor-fg-color);font-family:var(--md-mermaid-font-family)}line{stroke:var(--md-mermaid-sequence-actor-line-color)}.actor-man circle,.actor-man line{fill:var(--md-mermaid-sequence-actorman-bg-color);stroke:var(--md-mermaid-sequence-actorman-line-color)}.messageLine0,.messageLine1{stroke:var(--md-mermaid-sequence-message-line-color)}.note{fill:var(--md-mermaid-sequence-note-bg-color);stroke:var(--md-mermaid-sequence-note-border-color)}.loopText,.loopText>tspan,.messageText,.noteText>tspan{stroke:none;font-family:var(--md-mermaid-font-family)!important}.messageText{fill:var(--md-mermaid-sequence-message-fg-color)}.loopText,.loopText>tspan{fill:var(--md-mermaid-sequence-loop-fg-color)}.noteText>tspan{fill:var(--md-mermaid-sequence-note-fg-color)}#arrowhead 
path{fill:var(--md-mermaid-sequence-message-line-color);stroke:none}.loopLine{fill:var(--md-mermaid-sequence-loop-bg-color);stroke:var(--md-mermaid-sequence-loop-border-color)}.labelBox{fill:var(--md-mermaid-sequence-label-bg-color);stroke:none}.labelText,.labelText>span{fill:var(--md-mermaid-sequence-label-fg-color);font-family:var(--md-mermaid-font-family)}.sequenceNumber{fill:var(--md-mermaid-sequence-number-fg-color)}rect.rect{fill:var(--md-mermaid-sequence-box-bg-color);stroke:none}rect.rect+text.text{fill:var(--md-mermaid-sequence-box-fg-color)}defs #sequencenumber{fill:var(--md-mermaid-sequence-number-bg-color)!important}";var Gr,Qa=0;function Ka(){return typeof mermaid=="undefined"||mermaid instanceof Element?Tt("https://unpkg.com/mermaid@11/dist/mermaid.min.js"):I(void 0)}function Wn(e){return e.classList.remove("mermaid"),Gr||(Gr=Ka().pipe(w(()=>mermaid.initialize({startOnLoad:!1,themeCSS:Un,sequence:{actorFontSize:"16px",messageFontSize:"16px",noteFontSize:"16px"}})),m(()=>{}),G(1))),Gr.subscribe(()=>co(this,null,function*(){e.classList.add("mermaid");let t=`__mermaid_${Qa++}`,r=x("div",{class:"mermaid"}),o=e.textContent,{svg:n,fn:i}=yield mermaid.render(t,o),a=r.attachShadow({mode:"closed"});a.innerHTML=n,e.replaceWith(r),i==null||i(a)})),Gr.pipe(m(()=>({ref:e})))}var Dn=x("table");function Vn(e){return e.replaceWith(Dn),Dn.replaceWith(An(e)),I({ref:e})}function Ya(e){let t=e.find(r=>r.checked)||e[0];return O(...e.map(r=>h(r,"change").pipe(m(()=>R(`label[for="${r.id}"]`))))).pipe(Q(R(`label[for="${t.id}"]`)),m(r=>({active:r})))}function Nn(e,{viewport$:t,target$:r}){let o=R(".tabbed-labels",e),n=P(":scope > input",e),i=Kr("prev");e.append(i);let a=Kr("next");return e.append(a),C(()=>{let s=new g,p=s.pipe(Z(),ie(!0));z([s,ge(e),tt(e)]).pipe(W(p),Me(1,me)).subscribe({next([{active:c},l]){let f=Ve(c),{width:u}=ce(c);e.style.setProperty("--md-indicator-x",`${f.x}px`),e.style.setProperty("--md-indicator-width",`${u}px`);let 
d=pr(o);(f.xd.x+l.width)&&o.scrollTo({left:Math.max(0,f.x-16),behavior:"smooth"})},complete(){e.style.removeProperty("--md-indicator-x"),e.style.removeProperty("--md-indicator-width")}}),z([Ne(o),ge(o)]).pipe(W(p)).subscribe(([c,l])=>{let f=St(o);i.hidden=c.x<16,a.hidden=c.x>f.width-l.width-16}),O(h(i,"click").pipe(m(()=>-1)),h(a,"click").pipe(m(()=>1))).pipe(W(p)).subscribe(c=>{let{width:l}=ce(o);o.scrollBy({left:l*c,behavior:"smooth"})}),r.pipe(W(p),b(c=>n.includes(c))).subscribe(c=>c.click()),o.classList.add("tabbed-labels--linked");for(let c of n){let l=R(`label[for="${c.id}"]`);l.replaceChildren(x("a",{href:`#${l.htmlFor}`,tabIndex:-1},...Array.from(l.childNodes))),h(l.firstElementChild,"click").pipe(W(p),b(f=>!(f.metaKey||f.ctrlKey)),w(f=>{f.preventDefault(),f.stopPropagation()})).subscribe(()=>{history.replaceState({},"",`#${l.htmlFor}`),l.click()})}return B("content.tabs.link")&&s.pipe(Ce(1),re(t)).subscribe(([{active:c},{offset:l}])=>{let f=c.innerText.trim();if(c.hasAttribute("data-md-switching"))c.removeAttribute("data-md-switching");else{let u=e.offsetTop-l.y;for(let y of P("[data-tabs]"))for(let L of P(":scope > input",y)){let X=R(`label[for="${L.id}"]`);if(X!==c&&X.innerText.trim()===f){X.setAttribute("data-md-switching",""),L.click();break}}window.scrollTo({top:e.offsetTop-u});let d=__md_get("__tabs")||[];__md_set("__tabs",[...new Set([f,...d])])}}),s.pipe(W(p)).subscribe(()=>{for(let c of P("audio, video",e))c.pause()}),Ya(n).pipe(w(c=>s.next(c)),_(()=>s.complete()),m(c=>$({ref:e},c)))}).pipe(Ke(se))}function zn(e,{viewport$:t,target$:r,print$:o}){return O(...P(".annotate:not(.highlight)",e).map(n=>Pn(n,{target$:r,print$:o})),...P("pre:not(.mermaid) > 
code",e).map(n=>jn(n,{target$:r,print$:o})),...P("pre.mermaid",e).map(n=>Wn(n)),...P("table:not([class])",e).map(n=>Vn(n)),...P("details",e).map(n=>Fn(n,{target$:r,print$:o})),...P("[data-tabs]",e).map(n=>Nn(n,{viewport$:t,target$:r})),...P("[title]",e).filter(()=>B("content.tooltips")).map(n=>mt(n,{viewport$:t})))}function Ba(e,{alert$:t}){return t.pipe(v(r=>O(I(!0),I(!1).pipe(Ge(2e3))).pipe(m(o=>({message:r,active:o})))))}function qn(e,t){let r=R(".md-typeset",e);return C(()=>{let o=new g;return o.subscribe(({message:n,active:i})=>{e.classList.toggle("md-dialog--active",i),r.textContent=n}),Ba(e,t).pipe(w(n=>o.next(n)),_(()=>o.complete()),m(n=>$({ref:e},n)))})}var Ga=0;function Ja(e,t){document.body.append(e);let{width:r}=ce(e);e.style.setProperty("--md-tooltip-width",`${r}px`),e.remove();let o=cr(t),n=typeof o!="undefined"?Ne(o):I({x:0,y:0}),i=O(et(t),$t(t)).pipe(K());return z([i,n]).pipe(m(([a,s])=>{let{x:p,y:c}=Ve(t),l=ce(t),f=t.closest("table");return f&&t.parentElement&&(p+=f.offsetLeft+t.parentElement.offsetLeft,c+=f.offsetTop+t.parentElement.offsetTop),{active:a,offset:{x:p-s.x+l.width/2-r/2,y:c-s.y+l.height+8}}}))}function Qn(e){let t=e.title;if(!t.length)return S;let r=`__tooltip_${Ga++}`,o=Rt(r,"inline"),n=R(".md-typeset",o);return n.innerHTML=t,C(()=>{let i=new g;return 
i.subscribe({next({offset:a}){o.style.setProperty("--md-tooltip-x",`${a.x}px`),o.style.setProperty("--md-tooltip-y",`${a.y}px`)},complete(){o.style.removeProperty("--md-tooltip-x"),o.style.removeProperty("--md-tooltip-y")}}),O(i.pipe(b(({active:a})=>a)),i.pipe(_e(250),b(({active:a})=>!a))).subscribe({next({active:a}){a?(e.insertAdjacentElement("afterend",o),e.setAttribute("aria-describedby",r),e.removeAttribute("title")):(o.remove(),e.removeAttribute("aria-describedby"),e.setAttribute("title",t))},complete(){o.remove(),e.removeAttribute("aria-describedby"),e.setAttribute("title",t)}}),i.pipe(Me(16,me)).subscribe(({active:a})=>{o.classList.toggle("md-tooltip--active",a)}),i.pipe(pt(125,me),b(()=>!!e.offsetParent),m(()=>e.offsetParent.getBoundingClientRect()),m(({x:a})=>a)).subscribe({next(a){a?o.style.setProperty("--md-tooltip-0",`${-a}px`):o.style.removeProperty("--md-tooltip-0")},complete(){o.style.removeProperty("--md-tooltip-0")}}),Ja(o,e).pipe(w(a=>i.next(a)),_(()=>i.complete()),m(a=>$({ref:e},a)))}).pipe(Ke(se))}function Xa({viewport$:e}){if(!B("header.autohide"))return I(!1);let t=e.pipe(m(({offset:{y:n}})=>n),Be(2,1),m(([n,i])=>[nMath.abs(i-n.y)>100),m(([,[n]])=>n),K()),o=ze("search");return z([e,o]).pipe(m(([{offset:n},i])=>n.y>400&&!i),K(),v(n=>n?r:I(!1)),Q(!1))}function Kn(e,t){return C(()=>z([ge(e),Xa(t)])).pipe(m(([{height:r},o])=>({height:r,hidden:o})),K((r,o)=>r.height===o.height&&r.hidden===o.hidden),G(1))}function Yn(e,{header$:t,main$:r}){return C(()=>{let o=new g,n=o.pipe(Z(),ie(!0));o.pipe(ee("active"),He(t)).subscribe(([{active:a},{hidden:s}])=>{e.classList.toggle("md-header--shadow",a&&!s),e.hidden=s});let i=ue(P("[title]",e)).pipe(b(()=>B("content.tooltips")),ne(a=>Qn(a)));return r.subscribe(o),t.pipe(W(n),m(a=>$({ref:e},a)),Re(i.pipe(W(n))))})}function Za(e,{viewport$:t,header$:r}){return mr(e,{viewport$:t,header$:r}).pipe(m(({offset:{y:o}})=>{let{height:n}=ce(e);return{active:o>=n}}),ee("active"))}function Bn(e,t){return C(()=>{let r=new 
g;r.subscribe({next({active:n}){e.classList.toggle("md-header__title--active",n)},complete(){e.classList.remove("md-header__title--active")}});let o=fe(".md-content h1");return typeof o=="undefined"?S:Za(o,t).pipe(w(n=>r.next(n)),_(()=>r.complete()),m(n=>$({ref:e},n)))})}function Gn(e,{viewport$:t,header$:r}){let o=r.pipe(m(({height:i})=>i),K()),n=o.pipe(v(()=>ge(e).pipe(m(({height:i})=>({top:e.offsetTop,bottom:e.offsetTop+i})),ee("bottom"))));return z([o,n,t]).pipe(m(([i,{top:a,bottom:s},{offset:{y:p},size:{height:c}}])=>(c=Math.max(0,c-Math.max(0,a-p,i)-Math.max(0,c+p-s)),{offset:a-i,height:c,active:a-i<=p})),K((i,a)=>i.offset===a.offset&&i.height===a.height&&i.active===a.active))}function es(e){let t=__md_get("__palette")||{index:e.findIndex(o=>matchMedia(o.getAttribute("data-md-color-media")).matches)},r=Math.max(0,Math.min(t.index,e.length-1));return I(...e).pipe(ne(o=>h(o,"change").pipe(m(()=>o))),Q(e[r]),m(o=>({index:e.indexOf(o),color:{media:o.getAttribute("data-md-color-media"),scheme:o.getAttribute("data-md-color-scheme"),primary:o.getAttribute("data-md-color-primary"),accent:o.getAttribute("data-md-color-accent")}})),G(1))}function Jn(e){let t=P("input",e),r=x("meta",{name:"theme-color"});document.head.appendChild(r);let o=x("meta",{name:"color-scheme"});document.head.appendChild(o);let n=Pt("(prefers-color-scheme: light)");return C(()=>{let i=new g;return i.subscribe(a=>{if(document.body.setAttribute("data-md-color-switching",""),a.color.media==="(prefers-color-scheme)"){let s=matchMedia("(prefers-color-scheme: light)"),p=document.querySelector(s.matches?"[data-md-color-media='(prefers-color-scheme: light)']":"[data-md-color-media='(prefers-color-scheme: dark)']");a.color.scheme=p.getAttribute("data-md-color-scheme"),a.color.primary=p.getAttribute("data-md-color-primary"),a.color.accent=p.getAttribute("data-md-color-accent")}for(let[s,p]of Object.entries(a.color))document.body.setAttribute(`data-md-color-${s}`,p);for(let 
s=0;sa.key==="Enter"),re(i,(a,s)=>s)).subscribe(({index:a})=>{a=(a+1)%t.length,t[a].click(),t[a].focus()}),i.pipe(m(()=>{let a=Se("header"),s=window.getComputedStyle(a);return o.content=s.colorScheme,s.backgroundColor.match(/\d+/g).map(p=>(+p).toString(16).padStart(2,"0")).join("")})).subscribe(a=>r.content=`#${a}`),i.pipe(ve(se)).subscribe(()=>{document.body.removeAttribute("data-md-color-switching")}),es(t).pipe(W(n.pipe(Ce(1))),ct(),w(a=>i.next(a)),_(()=>i.complete()),m(a=>$({ref:e},a)))})}function Xn(e,{progress$:t}){return C(()=>{let r=new g;return r.subscribe(({value:o})=>{e.style.setProperty("--md-progress-value",`${o}`)}),t.pipe(w(o=>r.next({value:o})),_(()=>r.complete()),m(o=>({ref:e,value:o})))})}var Jr=Mt(Br());function ts(e){e.setAttribute("data-md-copying","");let t=e.closest("[data-copy]"),r=t?t.getAttribute("data-copy"):e.innerText;return e.removeAttribute("data-md-copying"),r.trimEnd()}function Zn({alert$:e}){Jr.default.isSupported()&&new j(t=>{new Jr.default("[data-clipboard-target], [data-clipboard-text]",{text:r=>r.getAttribute("data-clipboard-text")||ts(R(r.getAttribute("data-clipboard-target")))}).on("success",r=>t.next(r))}).pipe(w(t=>{t.trigger.focus()}),m(()=>Ee("clipboard.copied"))).subscribe(e)}function ei(e,t){return e.protocol=t.protocol,e.hostname=t.hostname,e}function rs(e,t){let r=new Map;for(let o of P("url",e)){let n=R("loc",o),i=[ei(new URL(n.textContent),t)];r.set(`${i[0]}`,i);for(let a of P("[rel=alternate]",o)){let s=a.getAttribute("href");s!=null&&i.push(ei(new URL(s),t))}}return r}function ur(e){return un(new URL("sitemap.xml",e)).pipe(m(t=>rs(t,new URL(e))),de(()=>I(new Map)))}function os(e,t){if(!(e.target instanceof Element))return S;let r=e.target.closest("a");if(r===null)return S;if(r.target||e.metaKey||e.ctrlKey)return S;let o=new URL(r.href);return o.search=o.hash="",t.has(`${o}`)?(e.preventDefault(),I(new URL(r.href))):S}function ti(e){let t=new Map;for(let r of P(":scope > *",e.head))t.set(r.outerHTML,r);return 
t}function ri(e){for(let t of P("[href], [src]",e))for(let r of["href","src"]){let o=t.getAttribute(r);if(o&&!/^(?:[a-z]+:)?\/\//i.test(o)){t[r]=t[r];break}}return I(e)}function ns(e){for(let o of["[data-md-component=announce]","[data-md-component=container]","[data-md-component=header-topic]","[data-md-component=outdated]","[data-md-component=logo]","[data-md-component=skip]",...B("navigation.tabs.sticky")?["[data-md-component=tabs]"]:[]]){let n=fe(o),i=fe(o,e);typeof n!="undefined"&&typeof i!="undefined"&&n.replaceWith(i)}let t=ti(document);for(let[o,n]of ti(e))t.has(o)?t.delete(o):document.head.appendChild(n);for(let o of t.values()){let n=o.getAttribute("name");n!=="theme-color"&&n!=="color-scheme"&&o.remove()}let r=Se("container");return We(P("script",r)).pipe(v(o=>{let n=e.createElement("script");if(o.src){for(let i of o.getAttributeNames())n.setAttribute(i,o.getAttribute(i));return o.replaceWith(n),new j(i=>{n.onload=()=>i.complete()})}else return n.textContent=o.textContent,o.replaceWith(n),S}),Z(),ie(document))}function oi({location$:e,viewport$:t,progress$:r}){let o=xe();if(location.protocol==="file:")return S;let n=ur(o.base);I(document).subscribe(ri);let i=h(document.body,"click").pipe(He(n),v(([p,c])=>os(p,c)),pe()),a=h(window,"popstate").pipe(m(ye),pe());i.pipe(re(t)).subscribe(([p,{offset:c}])=>{history.replaceState(c,""),history.pushState(null,"",p)}),O(i,a).subscribe(e);let s=e.pipe(ee("pathname"),v(p=>fn(p,{progress$:r}).pipe(de(()=>(lt(p,!0),S)))),v(ri),v(ns),pe());return O(s.pipe(re(e,(p,c)=>c)),s.pipe(v(()=>e),ee("pathname"),v(()=>e),ee("hash")),e.pipe(K((p,c)=>p.pathname===c.pathname&&p.hash===c.hash),v(()=>i),w(()=>history.back()))).subscribe(p=>{var c,l;history.state!==null||!p.hash?window.scrollTo(0,(l=(c=history.state)==null?void 
0:c.y)!=null?l:0):(history.scrollRestoration="auto",pn(p.hash),history.scrollRestoration="manual")}),e.subscribe(()=>{history.scrollRestoration="manual"}),h(window,"beforeunload").subscribe(()=>{history.scrollRestoration="auto"}),t.pipe(ee("offset"),_e(100)).subscribe(({offset:p})=>{history.replaceState(p,"")}),s}var ni=Mt(qr());function ii(e){let t=e.separator.split("|").map(n=>n.replace(/(\(\?[!=<][^)]+\))/g,"").length===0?"\uFFFD":n).join("|"),r=new RegExp(t,"img"),o=(n,i,a)=>`${i}${a}`;return n=>{n=n.replace(/[\s*+\-:~^]+/g," ").trim();let i=new RegExp(`(^|${e.separator}|)(${n.replace(/[|\\{}()[\]^$+*?.-]/g,"\\$&").replace(r,"|")})`,"img");return a=>(0,ni.default)(a).replace(i,o).replace(/<\/mark>(\s+)]*>/img,"$1")}}function jt(e){return e.type===1}function dr(e){return e.type===3}function ai(e,t){let r=yn(e);return O(I(location.protocol!=="file:"),ze("search")).pipe(Ae(o=>o),v(()=>t)).subscribe(({config:o,docs:n})=>r.next({type:0,data:{config:o,docs:n,options:{suggest:B("search.suggest")}}})),r}function si(e){var l;let{selectedVersionSitemap:t,selectedVersionBaseURL:r,currentLocation:o,currentBaseURL:n}=e,i=(l=Xr(n))==null?void 0:l.pathname;if(i===void 0)return;let a=ss(o.pathname,i);if(a===void 0)return;let s=ps(t.keys());if(!t.has(s))return;let p=Xr(a,s);if(!p||!t.has(p.href))return;let c=Xr(a,r);if(c)return c.hash=o.hash,c.search=o.search,c}function Xr(e,t){try{return new URL(e,t)}catch(r){return}}function ss(e,t){if(e.startsWith(t))return e.slice(t.length)}function cs(e,t){let r=Math.min(e.length,t.length),o;for(o=0;oS)),o=r.pipe(m(n=>{let[,i]=t.base.match(/([^/]+)\/?$/);return n.find(({version:a,aliases:s})=>a===i||s.includes(i))||n[0]}));r.pipe(m(n=>new Map(n.map(i=>[`${new URL(`../${i.version}/`,t.base)}`,i]))),v(n=>h(document.body,"click").pipe(b(i=>!i.metaKey&&!i.ctrlKey),re(o),v(([i,a])=>{if(i.target instanceof Element){let s=i.target.closest("a");if(s&&!s.target&&n.has(s.href)){let 
p=s.href;return!i.target.closest(".md-version")&&n.get(p)===a?S:(i.preventDefault(),I(new URL(p)))}}return S}),v(i=>ur(i).pipe(m(a=>{var s;return(s=si({selectedVersionSitemap:a,selectedVersionBaseURL:i,currentLocation:ye(),currentBaseURL:t.base}))!=null?s:i})))))).subscribe(n=>lt(n,!0)),z([r,o]).subscribe(([n,i])=>{R(".md-header__topic").appendChild(Cn(n,i))}),e.pipe(v(()=>o)).subscribe(n=>{var a;let i=__md_get("__outdated",sessionStorage);if(i===null){i=!0;let s=((a=t.version)==null?void 0:a.default)||"latest";Array.isArray(s)||(s=[s]);e:for(let p of s)for(let c of n.aliases.concat(n.version))if(new RegExp(p,"i").test(c)){i=!1;break e}__md_set("__outdated",i,sessionStorage)}if(i)for(let s of ae("outdated"))s.hidden=!1})}function ls(e,{worker$:t}){let{searchParams:r}=ye();r.has("q")&&(Je("search",!0),e.value=r.get("q"),e.focus(),ze("search").pipe(Ae(i=>!i)).subscribe(()=>{let i=ye();i.searchParams.delete("q"),history.replaceState({},"",`${i}`)}));let o=et(e),n=O(t.pipe(Ae(jt)),h(e,"keyup"),o).pipe(m(()=>e.value),K());return z([n,o]).pipe(m(([i,a])=>({value:i,focus:a})),G(1))}function pi(e,{worker$:t}){let r=new g,o=r.pipe(Z(),ie(!0));z([t.pipe(Ae(jt)),r],(i,a)=>a).pipe(ee("value")).subscribe(({value:i})=>t.next({type:2,data:i})),r.pipe(ee("focus")).subscribe(({focus:i})=>{i&&Je("search",i)}),h(e.form,"reset").pipe(W(o)).subscribe(()=>e.focus());let n=R("header [for=__search]");return h(n,"click").subscribe(()=>e.focus()),ls(e,{worker$:t}).pipe(w(i=>r.next(i)),_(()=>r.complete()),m(i=>$({ref:e},i)),G(1))}function li(e,{worker$:t,query$:r}){let o=new g,n=on(e.parentElement).pipe(b(Boolean)),i=e.parentElement,a=R(":scope > :first-child",e),s=R(":scope > :last-child",e);ze("search").subscribe(l=>s.setAttribute("role",l?"list":"presentation")),o.pipe(re(r),Wr(t.pipe(Ae(jt)))).subscribe(([{items:l},{value:f}])=>{switch(l.length){case 0:a.textContent=f.length?Ee("search.result.none"):Ee("search.result.placeholder");break;case 
1:a.textContent=Ee("search.result.one");break;default:let u=sr(l.length);a.textContent=Ee("search.result.other",u)}});let p=o.pipe(w(()=>s.innerHTML=""),v(({items:l})=>O(I(...l.slice(0,10)),I(...l.slice(10)).pipe(Be(4),Vr(n),v(([f])=>f)))),m(Mn),pe());return p.subscribe(l=>s.appendChild(l)),p.pipe(ne(l=>{let f=fe("details",l);return typeof f=="undefined"?S:h(f,"toggle").pipe(W(o),m(()=>f))})).subscribe(l=>{l.open===!1&&l.offsetTop<=i.scrollTop&&i.scrollTo({top:l.offsetTop})}),t.pipe(b(dr),m(({data:l})=>l)).pipe(w(l=>o.next(l)),_(()=>o.complete()),m(l=>$({ref:e},l)))}function ms(e,{query$:t}){return t.pipe(m(({value:r})=>{let o=ye();return o.hash="",r=r.replace(/\s+/g,"+").replace(/&/g,"%26").replace(/=/g,"%3D"),o.search=`q=${r}`,{url:o}}))}function mi(e,t){let r=new g,o=r.pipe(Z(),ie(!0));return r.subscribe(({url:n})=>{e.setAttribute("data-clipboard-text",e.href),e.href=`${n}`}),h(e,"click").pipe(W(o)).subscribe(n=>n.preventDefault()),ms(e,t).pipe(w(n=>r.next(n)),_(()=>r.complete()),m(n=>$({ref:e},n)))}function fi(e,{worker$:t,keyboard$:r}){let o=new g,n=Se("search-query"),i=O(h(n,"keydown"),h(n,"focus")).pipe(ve(se),m(()=>n.value),K());return o.pipe(He(i),m(([{suggest:s},p])=>{let c=p.split(/([\s-]+)/);if(s!=null&&s.length&&c[c.length-1]){let l=s[s.length-1];l.startsWith(c[c.length-1])&&(c[c.length-1]=l)}else c.length=0;return c})).subscribe(s=>e.innerHTML=s.join("").replace(/\s/g," ")),r.pipe(b(({mode:s})=>s==="search")).subscribe(s=>{switch(s.type){case"ArrowRight":e.innerText.length&&n.selectionStart===n.value.length&&(n.value=e.innerText);break}}),t.pipe(b(dr),m(({data:s})=>s)).pipe(w(s=>o.next(s)),_(()=>o.complete()),m(()=>({ref:e})))}function ui(e,{index$:t,keyboard$:r}){let o=xe();try{let n=ai(o.search,t),i=Se("search-query",e),a=Se("search-result",e);h(e,"click").pipe(b(({target:p})=>p instanceof Element&&!!p.closest("a"))).subscribe(()=>Je("search",!1)),r.pipe(b(({mode:p})=>p==="search")).subscribe(p=>{let c=Ie();switch(p.type){case"Enter":if(c===i){let 
l=new Map;for(let f of P(":first-child [href]",a)){let u=f.firstElementChild;l.set(f,parseFloat(u.getAttribute("data-md-score")))}if(l.size){let[[f]]=[...l].sort(([,u],[,d])=>d-u);f.click()}p.claim()}break;case"Escape":case"Tab":Je("search",!1),i.blur();break;case"ArrowUp":case"ArrowDown":if(typeof c=="undefined")i.focus();else{let l=[i,...P(":not(details) > [href], summary, details[open] [href]",a)],f=Math.max(0,(Math.max(0,l.indexOf(c))+l.length+(p.type==="ArrowUp"?-1:1))%l.length);l[f].focus()}p.claim();break;default:i!==Ie()&&i.focus()}}),r.pipe(b(({mode:p})=>p==="global")).subscribe(p=>{switch(p.type){case"f":case"s":case"/":i.focus(),i.select(),p.claim();break}});let s=pi(i,{worker$:n});return O(s,li(a,{worker$:n,query$:s})).pipe(Re(...ae("search-share",e).map(p=>mi(p,{query$:s})),...ae("search-suggest",e).map(p=>fi(p,{worker$:n,keyboard$:r}))))}catch(n){return e.hidden=!0,Ye}}function di(e,{index$:t,location$:r}){return z([t,r.pipe(Q(ye()),b(o=>!!o.searchParams.get("h")))]).pipe(m(([o,n])=>ii(o.config)(n.searchParams.get("h"))),m(o=>{var a;let n=new Map,i=document.createNodeIterator(e,NodeFilter.SHOW_TEXT);for(let s=i.nextNode();s;s=i.nextNode())if((a=s.parentElement)!=null&&a.offsetHeight){let p=s.textContent,c=o(p);c.length>p.length&&n.set(s,c)}for(let[s,p]of n){let{childNodes:c}=x("span",null,p);s.replaceWith(...Array.from(c))}return{ref:e,nodes:n}}))}function fs(e,{viewport$:t,main$:r}){let o=e.closest(".md-grid"),n=o.offsetTop-o.parentElement.offsetTop;return z([r,t]).pipe(m(([{offset:i,height:a},{offset:{y:s}}])=>(a=a+Math.min(n,Math.max(0,s-i))-n,{height:a,locked:s>=i+n})),K((i,a)=>i.height===a.height&&i.locked===a.locked))}function Zr(e,o){var n=o,{header$:t}=n,r=so(n,["header$"]);let i=R(".md-sidebar__scrollwrap",e),{y:a}=Ve(i);return C(()=>{let s=new g,p=s.pipe(Z(),ie(!0)),c=s.pipe(Me(0,me));return 
c.pipe(re(t)).subscribe({next([{height:l},{height:f}]){i.style.height=`${l-2*a}px`,e.style.top=`${f}px`},complete(){i.style.height="",e.style.top=""}}),c.pipe(Ae()).subscribe(()=>{for(let l of P(".md-nav__link--active[href]",e)){if(!l.clientHeight)continue;let f=l.closest(".md-sidebar__scrollwrap");if(typeof f!="undefined"){let u=l.offsetTop-f.offsetTop,{height:d}=ce(f);f.scrollTo({top:u-d/2})}}}),ue(P("label[tabindex]",e)).pipe(ne(l=>h(l,"click").pipe(ve(se),m(()=>l),W(p)))).subscribe(l=>{let f=R(`[id="${l.htmlFor}"]`);R(`[aria-labelledby="${l.id}"]`).setAttribute("aria-expanded",`${f.checked}`)}),fs(e,r).pipe(w(l=>s.next(l)),_(()=>s.complete()),m(l=>$({ref:e},l)))})}function hi(e,t){if(typeof t!="undefined"){let r=`https://api.github.com/repos/${e}/${t}`;return st(je(`${r}/releases/latest`).pipe(de(()=>S),m(o=>({version:o.tag_name})),De({})),je(r).pipe(de(()=>S),m(o=>({stars:o.stargazers_count,forks:o.forks_count})),De({}))).pipe(m(([o,n])=>$($({},o),n)))}else{let r=`https://api.github.com/users/${e}`;return je(r).pipe(m(o=>({repositories:o.public_repos})),De({}))}}function bi(e,t){let r=`https://${e}/api/v4/projects/${encodeURIComponent(t)}`;return st(je(`${r}/releases/permalink/latest`).pipe(de(()=>S),m(({tag_name:o})=>({version:o})),De({})),je(r).pipe(de(()=>S),m(({star_count:o,forks_count:n})=>({stars:o,forks:n})),De({}))).pipe(m(([o,n])=>$($({},o),n)))}function vi(e){let t=e.match(/^.+github\.com\/([^/]+)\/?([^/]+)?/i);if(t){let[,r,o]=t;return hi(r,o)}if(t=e.match(/^.+?([^/]*gitlab[^/]+)\/(.+?)\/?$/i),t){let[,r,o]=t;return bi(r,o)}return S}var us;function ds(e){return us||(us=C(()=>{let t=__md_get("__source",sessionStorage);if(t)return I(t);if(ae("consent").length){let o=__md_get("__consent");if(!(o&&o.github))return S}return vi(e.href).pipe(w(o=>__md_set("__source",o,sessionStorage)))}).pipe(de(()=>S),b(t=>Object.keys(t).length>0),m(t=>({facts:t})),G(1)))}function gi(e){let t=R(":scope > :last-child",e);return C(()=>{let r=new g;return 
r.subscribe(({facts:o})=>{t.appendChild(_n(o)),t.classList.add("md-source__repository--active")}),ds(e).pipe(w(o=>r.next(o)),_(()=>r.complete()),m(o=>$({ref:e},o)))})}function hs(e,{viewport$:t,header$:r}){return ge(document.body).pipe(v(()=>mr(e,{header$:r,viewport$:t})),m(({offset:{y:o}})=>({hidden:o>=10})),ee("hidden"))}function yi(e,t){return C(()=>{let r=new g;return r.subscribe({next({hidden:o}){e.hidden=o},complete(){e.hidden=!1}}),(B("navigation.tabs.sticky")?I({hidden:!1}):hs(e,t)).pipe(w(o=>r.next(o)),_(()=>r.complete()),m(o=>$({ref:e},o)))})}function bs(e,{viewport$:t,header$:r}){let o=new Map,n=P(".md-nav__link",e);for(let s of n){let p=decodeURIComponent(s.hash.substring(1)),c=fe(`[id="${p}"]`);typeof c!="undefined"&&o.set(s,c)}let i=r.pipe(ee("height"),m(({height:s})=>{let p=Se("main"),c=R(":scope > :first-child",p);return s+.8*(c.offsetTop-p.offsetTop)}),pe());return ge(document.body).pipe(ee("height"),v(s=>C(()=>{let p=[];return I([...o].reduce((c,[l,f])=>{for(;p.length&&o.get(p[p.length-1]).tagName>=f.tagName;)p.pop();let u=f.offsetTop;for(;!u&&f.parentElement;)f=f.parentElement,u=f.offsetTop;let d=f.offsetParent;for(;d;d=d.offsetParent)u+=d.offsetTop;return c.set([...p=[...p,l]].reverse(),u)},new Map))}).pipe(m(p=>new Map([...p].sort(([,c],[,l])=>c-l))),He(i),v(([p,c])=>t.pipe(Fr(([l,f],{offset:{y:u},size:d})=>{let y=u+d.height>=Math.floor(s.height);for(;f.length;){let[,L]=f[0];if(L-c=u&&!y)f=[l.pop(),...f];else break}return[l,f]},[[],[...p]]),K((l,f)=>l[0]===f[0]&&l[1]===f[1])))))).pipe(m(([s,p])=>({prev:s.map(([c])=>c),next:p.map(([c])=>c)})),Q({prev:[],next:[]}),Be(2,1),m(([s,p])=>s.prev.length{let i=new g,a=i.pipe(Z(),ie(!0));if(i.subscribe(({prev:s,next:p})=>{for(let[c]of p)c.classList.remove("md-nav__link--passed"),c.classList.remove("md-nav__link--active");for(let[c,[l]]of s.entries())l.classList.add("md-nav__link--passed"),l.classList.toggle("md-nav__link--active",c===s.length-1)}),B("toc.follow")){let 
s=O(t.pipe(_e(1),m(()=>{})),t.pipe(_e(250),m(()=>"smooth")));i.pipe(b(({prev:p})=>p.length>0),He(o.pipe(ve(se))),re(s)).subscribe(([[{prev:p}],c])=>{let[l]=p[p.length-1];if(l.offsetHeight){let f=cr(l);if(typeof f!="undefined"){let u=l.offsetTop-f.offsetTop,{height:d}=ce(f);f.scrollTo({top:u-d/2,behavior:c})}}})}return B("navigation.tracking")&&t.pipe(W(a),ee("offset"),_e(250),Ce(1),W(n.pipe(Ce(1))),ct({delay:250}),re(i)).subscribe(([,{prev:s}])=>{let p=ye(),c=s[s.length-1];if(c&&c.length){let[l]=c,{hash:f}=new URL(l.href);p.hash!==f&&(p.hash=f,history.replaceState({},"",`${p}`))}else p.hash="",history.replaceState({},"",`${p}`)}),bs(e,{viewport$:t,header$:r}).pipe(w(s=>i.next(s)),_(()=>i.complete()),m(s=>$({ref:e},s)))})}function vs(e,{viewport$:t,main$:r,target$:o}){let n=t.pipe(m(({offset:{y:a}})=>a),Be(2,1),m(([a,s])=>a>s&&s>0),K()),i=r.pipe(m(({active:a})=>a));return z([i,n]).pipe(m(([a,s])=>!(a&&s)),K(),W(o.pipe(Ce(1))),ie(!0),ct({delay:250}),m(a=>({hidden:a})))}function Ei(e,{viewport$:t,header$:r,main$:o,target$:n}){let i=new g,a=i.pipe(Z(),ie(!0));return i.subscribe({next({hidden:s}){e.hidden=s,s?(e.setAttribute("tabindex","-1"),e.blur()):e.removeAttribute("tabindex")},complete(){e.style.top="",e.hidden=!0,e.removeAttribute("tabindex")}}),r.pipe(W(a),ee("height")).subscribe(({height:s})=>{e.style.top=`${s+16}px`}),h(e,"click").subscribe(s=>{s.preventDefault(),window.scrollTo({top:0})}),vs(e,{viewport$:t,main$:o,target$:n}).pipe(w(s=>i.next(s)),_(()=>i.complete()),m(s=>$({ref:e},s)))}function wi({document$:e,viewport$:t}){e.pipe(v(()=>P(".md-ellipsis")),ne(r=>tt(r).pipe(W(e.pipe(Ce(1))),b(o=>o),m(()=>r),Te(1))),b(r=>r.offsetWidth{let o=r.innerText,n=r.closest("a")||r;return n.title=o,B("content.tooltips")?mt(n,{viewport$:t}).pipe(W(e.pipe(Ce(1))),_(()=>n.removeAttribute("title"))):S})).subscribe(),B("content.tooltips")&&e.pipe(v(()=>P(".md-status")),ne(r=>mt(r,{viewport$:t}))).subscribe()}function 
Ti({document$:e,tablet$:t}){e.pipe(v(()=>P(".md-toggle--indeterminate")),w(r=>{r.indeterminate=!0,r.checked=!1}),ne(r=>h(r,"change").pipe(Dr(()=>r.classList.contains("md-toggle--indeterminate")),m(()=>r))),re(t)).subscribe(([r,o])=>{r.classList.remove("md-toggle--indeterminate"),o&&(r.checked=!1)})}function gs(){return/(iPad|iPhone|iPod)/.test(navigator.userAgent)}function Si({document$:e}){e.pipe(v(()=>P("[data-md-scrollfix]")),w(t=>t.removeAttribute("data-md-scrollfix")),b(gs),ne(t=>h(t,"touchstart").pipe(m(()=>t)))).subscribe(t=>{let r=t.scrollTop;r===0?t.scrollTop=1:r+t.offsetHeight===t.scrollHeight&&(t.scrollTop=r-1)})}function Oi({viewport$:e,tablet$:t}){z([ze("search"),t]).pipe(m(([r,o])=>r&&!o),v(r=>I(r).pipe(Ge(r?400:100))),re(e)).subscribe(([r,{offset:{y:o}}])=>{if(r)document.body.setAttribute("data-md-scrolllock",""),document.body.style.top=`-${o}px`;else{let n=-1*parseInt(document.body.style.top,10);document.body.removeAttribute("data-md-scrolllock"),document.body.style.top="",n&&window.scrollTo(0,n)}})}Object.entries||(Object.entries=function(e){let t=[];for(let r of Object.keys(e))t.push([r,e[r]]);return t});Object.values||(Object.values=function(e){let t=[];for(let r of Object.keys(e))t.push(e[r]);return t});typeof Element!="undefined"&&(Element.prototype.scrollTo||(Element.prototype.scrollTo=function(e,t){typeof e=="object"?(this.scrollLeft=e.left,this.scrollTop=e.top):(this.scrollLeft=e,this.scrollTop=t)}),Element.prototype.replaceWith||(Element.prototype.replaceWith=function(...e){let t=this.parentNode;if(t){e.length===0&&t.removeChild(this);for(let r=e.length-1;r>=0;r--){let o=e[r];typeof o=="string"?o=document.createTextNode(o):o.parentNode&&o.parentNode.removeChild(o),r?t.insertBefore(this.previousSibling,o):t.replaceChild(o,this)}}}));function ys(){return location.protocol==="file:"?Tt(`${new URL("search/search_index.js",eo.base)}`).pipe(m(()=>__index),G(1)):je(new 
URL("search/search_index.json",eo.base))}document.documentElement.classList.remove("no-js");document.documentElement.classList.add("js");var ot=Go(),Ut=sn(),Lt=ln(Ut),to=an(),Oe=gn(),hr=Pt("(min-width: 960px)"),Mi=Pt("(min-width: 1220px)"),_i=mn(),eo=xe(),Ai=document.forms.namedItem("search")?ys():Ye,ro=new g;Zn({alert$:ro});var oo=new g;B("navigation.instant")&&oi({location$:Ut,viewport$:Oe,progress$:oo}).subscribe(ot);var Li;((Li=eo.version)==null?void 0:Li.provider)==="mike"&&ci({document$:ot});O(Ut,Lt).pipe(Ge(125)).subscribe(()=>{Je("drawer",!1),Je("search",!1)});to.pipe(b(({mode:e})=>e==="global")).subscribe(e=>{switch(e.type){case"p":case",":let t=fe("link[rel=prev]");typeof t!="undefined"&<(t);break;case"n":case".":let r=fe("link[rel=next]");typeof r!="undefined"&<(r);break;case"Enter":let o=Ie();o instanceof HTMLLabelElement&&o.click()}});wi({viewport$:Oe,document$:ot});Ti({document$:ot,tablet$:hr});Si({document$:ot});Oi({viewport$:Oe,tablet$:hr});var rt=Kn(Se("header"),{viewport$:Oe}),Ft=ot.pipe(m(()=>Se("main")),v(e=>Gn(e,{viewport$:Oe,header$:rt})),G(1)),xs=O(...ae("consent").map(e=>En(e,{target$:Lt})),...ae("dialog").map(e=>qn(e,{alert$:ro})),...ae("header").map(e=>Yn(e,{viewport$:Oe,header$:rt,main$:Ft})),...ae("palette").map(e=>Jn(e)),...ae("progress").map(e=>Xn(e,{progress$:oo})),...ae("search").map(e=>ui(e,{index$:Ai,keyboard$:to})),...ae("source").map(e=>gi(e))),Es=C(()=>O(...ae("announce").map(e=>xn(e)),...ae("content").map(e=>zn(e,{viewport$:Oe,target$:Lt,print$:_i})),...ae("content").map(e=>B("search.highlight")?di(e,{index$:Ai,location$:Ut}):S),...ae("header-title").map(e=>Bn(e,{viewport$:Oe,header$:rt})),...ae("sidebar").map(e=>e.getAttribute("data-md-type")==="navigation"?Nr(Mi,()=>Zr(e,{viewport$:Oe,header$:rt,main$:Ft})):Nr(hr,()=>Zr(e,{viewport$:Oe,header$:rt,main$:Ft}))),...ae("tabs").map(e=>yi(e,{viewport$:Oe,header$:rt})),...ae("toc").map(e=>xi(e,{viewport$:Oe,header$:rt,main$:Ft,target$:Lt})),...ae("top").map(e=>Ei(e,{viewport$:Oe,head
er$:rt,main$:Ft,target$:Lt})))),Ci=ot.pipe(v(()=>Es),Re(xs),G(1));Ci.subscribe();window.document$=ot;window.location$=Ut;window.target$=Lt;window.keyboard$=to;window.viewport$=Oe;window.tablet$=hr;window.screen$=Mi;window.print$=_i;window.alert$=ro;window.progress$=oo;window.component$=Ci;})(); -//# sourceMappingURL=bundle.525ec568.min.js.map - diff --git a/2.2.2/assets/javascripts/bundle.525ec568.min.js.map b/2.2.2/assets/javascripts/bundle.525ec568.min.js.map deleted file mode 100644 index ef5d8d3..0000000 --- a/2.2.2/assets/javascripts/bundle.525ec568.min.js.map +++ /dev/null @@ -1,7 +0,0 @@ -{ - "version": 3, - "sources": ["node_modules/focus-visible/dist/focus-visible.js", "node_modules/escape-html/index.js", "node_modules/clipboard/dist/clipboard.js", "src/templates/assets/javascripts/bundle.ts", "node_modules/tslib/tslib.es6.mjs", "node_modules/rxjs/src/internal/util/isFunction.ts", "node_modules/rxjs/src/internal/util/createErrorClass.ts", "node_modules/rxjs/src/internal/util/UnsubscriptionError.ts", "node_modules/rxjs/src/internal/util/arrRemove.ts", "node_modules/rxjs/src/internal/Subscription.ts", "node_modules/rxjs/src/internal/config.ts", "node_modules/rxjs/src/internal/scheduler/timeoutProvider.ts", "node_modules/rxjs/src/internal/util/reportUnhandledError.ts", "node_modules/rxjs/src/internal/util/noop.ts", "node_modules/rxjs/src/internal/NotificationFactories.ts", "node_modules/rxjs/src/internal/util/errorContext.ts", "node_modules/rxjs/src/internal/Subscriber.ts", "node_modules/rxjs/src/internal/symbol/observable.ts", "node_modules/rxjs/src/internal/util/identity.ts", "node_modules/rxjs/src/internal/util/pipe.ts", "node_modules/rxjs/src/internal/Observable.ts", "node_modules/rxjs/src/internal/util/lift.ts", "node_modules/rxjs/src/internal/operators/OperatorSubscriber.ts", "node_modules/rxjs/src/internal/scheduler/animationFrameProvider.ts", "node_modules/rxjs/src/internal/util/ObjectUnsubscribedError.ts", "node_modules/rxjs/src/internal/Subject.ts", 
"node_modules/rxjs/src/internal/BehaviorSubject.ts", "node_modules/rxjs/src/internal/scheduler/dateTimestampProvider.ts", "node_modules/rxjs/src/internal/ReplaySubject.ts", "node_modules/rxjs/src/internal/scheduler/Action.ts", "node_modules/rxjs/src/internal/scheduler/intervalProvider.ts", "node_modules/rxjs/src/internal/scheduler/AsyncAction.ts", "node_modules/rxjs/src/internal/Scheduler.ts", "node_modules/rxjs/src/internal/scheduler/AsyncScheduler.ts", "node_modules/rxjs/src/internal/scheduler/async.ts", "node_modules/rxjs/src/internal/scheduler/QueueAction.ts", "node_modules/rxjs/src/internal/scheduler/QueueScheduler.ts", "node_modules/rxjs/src/internal/scheduler/queue.ts", "node_modules/rxjs/src/internal/scheduler/AnimationFrameAction.ts", "node_modules/rxjs/src/internal/scheduler/AnimationFrameScheduler.ts", "node_modules/rxjs/src/internal/scheduler/animationFrame.ts", "node_modules/rxjs/src/internal/observable/empty.ts", "node_modules/rxjs/src/internal/util/isScheduler.ts", "node_modules/rxjs/src/internal/util/args.ts", "node_modules/rxjs/src/internal/util/isArrayLike.ts", "node_modules/rxjs/src/internal/util/isPromise.ts", "node_modules/rxjs/src/internal/util/isInteropObservable.ts", "node_modules/rxjs/src/internal/util/isAsyncIterable.ts", "node_modules/rxjs/src/internal/util/throwUnobservableError.ts", "node_modules/rxjs/src/internal/symbol/iterator.ts", "node_modules/rxjs/src/internal/util/isIterable.ts", "node_modules/rxjs/src/internal/util/isReadableStreamLike.ts", "node_modules/rxjs/src/internal/observable/innerFrom.ts", "node_modules/rxjs/src/internal/util/executeSchedule.ts", "node_modules/rxjs/src/internal/operators/observeOn.ts", "node_modules/rxjs/src/internal/operators/subscribeOn.ts", "node_modules/rxjs/src/internal/scheduled/scheduleObservable.ts", "node_modules/rxjs/src/internal/scheduled/schedulePromise.ts", "node_modules/rxjs/src/internal/scheduled/scheduleArray.ts", "node_modules/rxjs/src/internal/scheduled/scheduleIterable.ts", 
"node_modules/rxjs/src/internal/scheduled/scheduleAsyncIterable.ts", "node_modules/rxjs/src/internal/scheduled/scheduleReadableStreamLike.ts", "node_modules/rxjs/src/internal/scheduled/scheduled.ts", "node_modules/rxjs/src/internal/observable/from.ts", "node_modules/rxjs/src/internal/observable/of.ts", "node_modules/rxjs/src/internal/observable/throwError.ts", "node_modules/rxjs/src/internal/util/EmptyError.ts", "node_modules/rxjs/src/internal/util/isDate.ts", "node_modules/rxjs/src/internal/operators/map.ts", "node_modules/rxjs/src/internal/util/mapOneOrManyArgs.ts", "node_modules/rxjs/src/internal/util/argsArgArrayOrObject.ts", "node_modules/rxjs/src/internal/util/createObject.ts", "node_modules/rxjs/src/internal/observable/combineLatest.ts", "node_modules/rxjs/src/internal/operators/mergeInternals.ts", "node_modules/rxjs/src/internal/operators/mergeMap.ts", "node_modules/rxjs/src/internal/operators/mergeAll.ts", "node_modules/rxjs/src/internal/operators/concatAll.ts", "node_modules/rxjs/src/internal/observable/concat.ts", "node_modules/rxjs/src/internal/observable/defer.ts", "node_modules/rxjs/src/internal/observable/fromEvent.ts", "node_modules/rxjs/src/internal/observable/fromEventPattern.ts", "node_modules/rxjs/src/internal/observable/timer.ts", "node_modules/rxjs/src/internal/observable/merge.ts", "node_modules/rxjs/src/internal/observable/never.ts", "node_modules/rxjs/src/internal/util/argsOrArgArray.ts", "node_modules/rxjs/src/internal/operators/filter.ts", "node_modules/rxjs/src/internal/observable/zip.ts", "node_modules/rxjs/src/internal/operators/audit.ts", "node_modules/rxjs/src/internal/operators/auditTime.ts", "node_modules/rxjs/src/internal/operators/bufferCount.ts", "node_modules/rxjs/src/internal/operators/catchError.ts", "node_modules/rxjs/src/internal/operators/scanInternals.ts", "node_modules/rxjs/src/internal/operators/combineLatest.ts", "node_modules/rxjs/src/internal/operators/combineLatestWith.ts", 
"node_modules/rxjs/src/internal/operators/debounce.ts", "node_modules/rxjs/src/internal/operators/debounceTime.ts", "node_modules/rxjs/src/internal/operators/defaultIfEmpty.ts", "node_modules/rxjs/src/internal/operators/take.ts", "node_modules/rxjs/src/internal/operators/ignoreElements.ts", "node_modules/rxjs/src/internal/operators/mapTo.ts", "node_modules/rxjs/src/internal/operators/delayWhen.ts", "node_modules/rxjs/src/internal/operators/delay.ts", "node_modules/rxjs/src/internal/operators/distinctUntilChanged.ts", "node_modules/rxjs/src/internal/operators/distinctUntilKeyChanged.ts", "node_modules/rxjs/src/internal/operators/throwIfEmpty.ts", "node_modules/rxjs/src/internal/operators/endWith.ts", "node_modules/rxjs/src/internal/operators/finalize.ts", "node_modules/rxjs/src/internal/operators/first.ts", "node_modules/rxjs/src/internal/operators/takeLast.ts", "node_modules/rxjs/src/internal/operators/merge.ts", "node_modules/rxjs/src/internal/operators/mergeWith.ts", "node_modules/rxjs/src/internal/operators/repeat.ts", "node_modules/rxjs/src/internal/operators/scan.ts", "node_modules/rxjs/src/internal/operators/share.ts", "node_modules/rxjs/src/internal/operators/shareReplay.ts", "node_modules/rxjs/src/internal/operators/skip.ts", "node_modules/rxjs/src/internal/operators/skipUntil.ts", "node_modules/rxjs/src/internal/operators/startWith.ts", "node_modules/rxjs/src/internal/operators/switchMap.ts", "node_modules/rxjs/src/internal/operators/takeUntil.ts", "node_modules/rxjs/src/internal/operators/takeWhile.ts", "node_modules/rxjs/src/internal/operators/tap.ts", "node_modules/rxjs/src/internal/operators/throttle.ts", "node_modules/rxjs/src/internal/operators/throttleTime.ts", "node_modules/rxjs/src/internal/operators/withLatestFrom.ts", "node_modules/rxjs/src/internal/operators/zip.ts", "node_modules/rxjs/src/internal/operators/zipWith.ts", "src/templates/assets/javascripts/browser/document/index.ts", "src/templates/assets/javascripts/browser/element/_/index.ts", 
"src/templates/assets/javascripts/browser/element/focus/index.ts", "src/templates/assets/javascripts/browser/element/hover/index.ts", "src/templates/assets/javascripts/utilities/h/index.ts", "src/templates/assets/javascripts/utilities/round/index.ts", "src/templates/assets/javascripts/browser/script/index.ts", "src/templates/assets/javascripts/browser/element/size/_/index.ts", "src/templates/assets/javascripts/browser/element/size/content/index.ts", "src/templates/assets/javascripts/browser/element/offset/_/index.ts", "src/templates/assets/javascripts/browser/element/offset/content/index.ts", "src/templates/assets/javascripts/browser/element/visibility/index.ts", "src/templates/assets/javascripts/browser/toggle/index.ts", "src/templates/assets/javascripts/browser/keyboard/index.ts", "src/templates/assets/javascripts/browser/location/_/index.ts", "src/templates/assets/javascripts/browser/location/hash/index.ts", "src/templates/assets/javascripts/browser/media/index.ts", "src/templates/assets/javascripts/browser/request/index.ts", "src/templates/assets/javascripts/browser/viewport/offset/index.ts", "src/templates/assets/javascripts/browser/viewport/size/index.ts", "src/templates/assets/javascripts/browser/viewport/_/index.ts", "src/templates/assets/javascripts/browser/viewport/at/index.ts", "src/templates/assets/javascripts/browser/worker/index.ts", "src/templates/assets/javascripts/_/index.ts", "src/templates/assets/javascripts/components/_/index.ts", "src/templates/assets/javascripts/components/announce/index.ts", "src/templates/assets/javascripts/components/consent/index.ts", "src/templates/assets/javascripts/templates/tooltip/index.tsx", "src/templates/assets/javascripts/templates/annotation/index.tsx", "src/templates/assets/javascripts/templates/clipboard/index.tsx", "src/templates/assets/javascripts/templates/search/index.tsx", "src/templates/assets/javascripts/templates/source/index.tsx", "src/templates/assets/javascripts/templates/tabbed/index.tsx", 
"src/templates/assets/javascripts/templates/table/index.tsx", "src/templates/assets/javascripts/templates/version/index.tsx", "src/templates/assets/javascripts/components/tooltip2/index.ts", "src/templates/assets/javascripts/components/content/annotation/_/index.ts", "src/templates/assets/javascripts/components/content/annotation/list/index.ts", "src/templates/assets/javascripts/components/content/annotation/block/index.ts", "src/templates/assets/javascripts/components/content/code/_/index.ts", "src/templates/assets/javascripts/components/content/details/index.ts", "src/templates/assets/javascripts/components/content/mermaid/index.css", "src/templates/assets/javascripts/components/content/mermaid/index.ts", "src/templates/assets/javascripts/components/content/table/index.ts", "src/templates/assets/javascripts/components/content/tabs/index.ts", "src/templates/assets/javascripts/components/content/_/index.ts", "src/templates/assets/javascripts/components/dialog/index.ts", "src/templates/assets/javascripts/components/tooltip/index.ts", "src/templates/assets/javascripts/components/header/_/index.ts", "src/templates/assets/javascripts/components/header/title/index.ts", "src/templates/assets/javascripts/components/main/index.ts", "src/templates/assets/javascripts/components/palette/index.ts", "src/templates/assets/javascripts/components/progress/index.ts", "src/templates/assets/javascripts/integrations/clipboard/index.ts", "src/templates/assets/javascripts/integrations/sitemap/index.ts", "src/templates/assets/javascripts/integrations/instant/index.ts", "src/templates/assets/javascripts/integrations/search/highlighter/index.ts", "src/templates/assets/javascripts/integrations/search/worker/message/index.ts", "src/templates/assets/javascripts/integrations/search/worker/_/index.ts", "src/templates/assets/javascripts/integrations/version/findurl/index.ts", "src/templates/assets/javascripts/integrations/version/index.ts", 
"src/templates/assets/javascripts/components/search/query/index.ts", "src/templates/assets/javascripts/components/search/result/index.ts", "src/templates/assets/javascripts/components/search/share/index.ts", "src/templates/assets/javascripts/components/search/suggest/index.ts", "src/templates/assets/javascripts/components/search/_/index.ts", "src/templates/assets/javascripts/components/search/highlight/index.ts", "src/templates/assets/javascripts/components/sidebar/index.ts", "src/templates/assets/javascripts/components/source/facts/github/index.ts", "src/templates/assets/javascripts/components/source/facts/gitlab/index.ts", "src/templates/assets/javascripts/components/source/facts/_/index.ts", "src/templates/assets/javascripts/components/source/_/index.ts", "src/templates/assets/javascripts/components/tabs/index.ts", "src/templates/assets/javascripts/components/toc/index.ts", "src/templates/assets/javascripts/components/top/index.ts", "src/templates/assets/javascripts/patches/ellipsis/index.ts", "src/templates/assets/javascripts/patches/indeterminate/index.ts", "src/templates/assets/javascripts/patches/scrollfix/index.ts", "src/templates/assets/javascripts/patches/scrolllock/index.ts", "src/templates/assets/javascripts/polyfills/index.ts"], - "sourcesContent": ["(function (global, factory) {\n typeof exports === 'object' && typeof module !== 'undefined' ? factory() :\n typeof define === 'function' && define.amd ? 
define(factory) :\n (factory());\n}(this, (function () { 'use strict';\n\n /**\n * Applies the :focus-visible polyfill at the given scope.\n * A scope in this case is either the top-level Document or a Shadow Root.\n *\n * @param {(Document|ShadowRoot)} scope\n * @see https://github.com/WICG/focus-visible\n */\n function applyFocusVisiblePolyfill(scope) {\n var hadKeyboardEvent = true;\n var hadFocusVisibleRecently = false;\n var hadFocusVisibleRecentlyTimeout = null;\n\n var inputTypesAllowlist = {\n text: true,\n search: true,\n url: true,\n tel: true,\n email: true,\n password: true,\n number: true,\n date: true,\n month: true,\n week: true,\n time: true,\n datetime: true,\n 'datetime-local': true\n };\n\n /**\n * Helper function for legacy browsers and iframes which sometimes focus\n * elements like document, body, and non-interactive SVG.\n * @param {Element} el\n */\n function isValidFocusTarget(el) {\n if (\n el &&\n el !== document &&\n el.nodeName !== 'HTML' &&\n el.nodeName !== 'BODY' &&\n 'classList' in el &&\n 'contains' in el.classList\n ) {\n return true;\n }\n return false;\n }\n\n /**\n * Computes whether the given element should automatically trigger the\n * `focus-visible` class being added, i.e. 
whether it should always match\n * `:focus-visible` when focused.\n * @param {Element} el\n * @return {boolean}\n */\n function focusTriggersKeyboardModality(el) {\n var type = el.type;\n var tagName = el.tagName;\n\n if (tagName === 'INPUT' && inputTypesAllowlist[type] && !el.readOnly) {\n return true;\n }\n\n if (tagName === 'TEXTAREA' && !el.readOnly) {\n return true;\n }\n\n if (el.isContentEditable) {\n return true;\n }\n\n return false;\n }\n\n /**\n * Add the `focus-visible` class to the given element if it was not added by\n * the author.\n * @param {Element} el\n */\n function addFocusVisibleClass(el) {\n if (el.classList.contains('focus-visible')) {\n return;\n }\n el.classList.add('focus-visible');\n el.setAttribute('data-focus-visible-added', '');\n }\n\n /**\n * Remove the `focus-visible` class from the given element if it was not\n * originally added by the author.\n * @param {Element} el\n */\n function removeFocusVisibleClass(el) {\n if (!el.hasAttribute('data-focus-visible-added')) {\n return;\n }\n el.classList.remove('focus-visible');\n el.removeAttribute('data-focus-visible-added');\n }\n\n /**\n * If the most recent user interaction was via the keyboard;\n * and the key press did not include a meta, alt/option, or control key;\n * then the modality is keyboard. 
Otherwise, the modality is not keyboard.\n * Apply `focus-visible` to any current active element and keep track\n * of our keyboard modality state with `hadKeyboardEvent`.\n * @param {KeyboardEvent} e\n */\n function onKeyDown(e) {\n if (e.metaKey || e.altKey || e.ctrlKey) {\n return;\n }\n\n if (isValidFocusTarget(scope.activeElement)) {\n addFocusVisibleClass(scope.activeElement);\n }\n\n hadKeyboardEvent = true;\n }\n\n /**\n * If at any point a user clicks with a pointing device, ensure that we change\n * the modality away from keyboard.\n * This avoids the situation where a user presses a key on an already focused\n * element, and then clicks on a different element, focusing it with a\n * pointing device, while we still think we're in keyboard modality.\n * @param {Event} e\n */\n function onPointerDown(e) {\n hadKeyboardEvent = false;\n }\n\n /**\n * On `focus`, add the `focus-visible` class to the target if:\n * - the target received focus as a result of keyboard navigation, or\n * - the event target is an element that will likely require interaction\n * via the keyboard (e.g. 
a text box)\n * @param {Event} e\n */\n function onFocus(e) {\n // Prevent IE from focusing the document or HTML element.\n if (!isValidFocusTarget(e.target)) {\n return;\n }\n\n if (hadKeyboardEvent || focusTriggersKeyboardModality(e.target)) {\n addFocusVisibleClass(e.target);\n }\n }\n\n /**\n * On `blur`, remove the `focus-visible` class from the target.\n * @param {Event} e\n */\n function onBlur(e) {\n if (!isValidFocusTarget(e.target)) {\n return;\n }\n\n if (\n e.target.classList.contains('focus-visible') ||\n e.target.hasAttribute('data-focus-visible-added')\n ) {\n // To detect a tab/window switch, we look for a blur event followed\n // rapidly by a visibility change.\n // If we don't see a visibility change within 100ms, it's probably a\n // regular focus change.\n hadFocusVisibleRecently = true;\n window.clearTimeout(hadFocusVisibleRecentlyTimeout);\n hadFocusVisibleRecentlyTimeout = window.setTimeout(function() {\n hadFocusVisibleRecently = false;\n }, 100);\n removeFocusVisibleClass(e.target);\n }\n }\n\n /**\n * If the user changes tabs, keep track of whether or not the previously\n * focused element had .focus-visible.\n * @param {Event} e\n */\n function onVisibilityChange(e) {\n if (document.visibilityState === 'hidden') {\n // If the tab becomes active again, the browser will handle calling focus\n // on the element (Safari actually calls it twice).\n // If this tab change caused a blur on an element with focus-visible,\n // re-apply the class when the user switches back to the tab.\n if (hadFocusVisibleRecently) {\n hadKeyboardEvent = true;\n }\n addInitialPointerMoveListeners();\n }\n }\n\n /**\n * Add a group of listeners to detect usage of any pointing devices.\n * These listeners will be added when the polyfill first loads, and anytime\n * the window is blurred, so that they are active when the window regains\n * focus.\n */\n function addInitialPointerMoveListeners() {\n document.addEventListener('mousemove', onInitialPointerMove);\n 
document.addEventListener('mousedown', onInitialPointerMove);\n document.addEventListener('mouseup', onInitialPointerMove);\n document.addEventListener('pointermove', onInitialPointerMove);\n document.addEventListener('pointerdown', onInitialPointerMove);\n document.addEventListener('pointerup', onInitialPointerMove);\n document.addEventListener('touchmove', onInitialPointerMove);\n document.addEventListener('touchstart', onInitialPointerMove);\n document.addEventListener('touchend', onInitialPointerMove);\n }\n\n function removeInitialPointerMoveListeners() {\n document.removeEventListener('mousemove', onInitialPointerMove);\n document.removeEventListener('mousedown', onInitialPointerMove);\n document.removeEventListener('mouseup', onInitialPointerMove);\n document.removeEventListener('pointermove', onInitialPointerMove);\n document.removeEventListener('pointerdown', onInitialPointerMove);\n document.removeEventListener('pointerup', onInitialPointerMove);\n document.removeEventListener('touchmove', onInitialPointerMove);\n document.removeEventListener('touchstart', onInitialPointerMove);\n document.removeEventListener('touchend', onInitialPointerMove);\n }\n\n /**\n * When the polfyill first loads, assume the user is in keyboard modality.\n * If any event is received from a pointing device (e.g. mouse, pointer,\n * touch), turn off keyboard modality.\n * This accounts for situations where focus enters the page from the URL bar.\n * @param {Event} e\n */\n function onInitialPointerMove(e) {\n // Work around a Safari quirk that fires a mousemove on whenever the\n // window blurs, even if you're tabbing out of the page. \u00AF\\_(\u30C4)_/\u00AF\n if (e.target.nodeName && e.target.nodeName.toLowerCase() === 'html') {\n return;\n }\n\n hadKeyboardEvent = false;\n removeInitialPointerMoveListeners();\n }\n\n // For some kinds of state, we are interested in changes at the global scope\n // only. 
For example, global pointer input, global key presses and global\n // visibility change should affect the state at every scope:\n document.addEventListener('keydown', onKeyDown, true);\n document.addEventListener('mousedown', onPointerDown, true);\n document.addEventListener('pointerdown', onPointerDown, true);\n document.addEventListener('touchstart', onPointerDown, true);\n document.addEventListener('visibilitychange', onVisibilityChange, true);\n\n addInitialPointerMoveListeners();\n\n // For focus and blur, we specifically care about state changes in the local\n // scope. This is because focus / blur events that originate from within a\n // shadow root are not re-dispatched from the host element if it was already\n // the active element in its own scope:\n scope.addEventListener('focus', onFocus, true);\n scope.addEventListener('blur', onBlur, true);\n\n // We detect that a node is a ShadowRoot by ensuring that it is a\n // DocumentFragment and also has a host property. This check covers native\n // implementation and polyfill implementation transparently. If we only cared\n // about the native implementation, we could just check if the scope was\n // an instance of a ShadowRoot.\n if (scope.nodeType === Node.DOCUMENT_FRAGMENT_NODE && scope.host) {\n // Since a ShadowRoot is a special kind of DocumentFragment, it does not\n // have a root element to add a class to. 
So, we add this attribute to the\n // host element instead:\n scope.host.setAttribute('data-js-focus-visible', '');\n } else if (scope.nodeType === Node.DOCUMENT_NODE) {\n document.documentElement.classList.add('js-focus-visible');\n document.documentElement.setAttribute('data-js-focus-visible', '');\n }\n }\n\n // It is important to wrap all references to global window and document in\n // these checks to support server-side rendering use cases\n // @see https://github.com/WICG/focus-visible/issues/199\n if (typeof window !== 'undefined' && typeof document !== 'undefined') {\n // Make the polyfill helper globally available. This can be used as a signal\n // to interested libraries that wish to coordinate with the polyfill for e.g.,\n // applying the polyfill to a shadow root:\n window.applyFocusVisiblePolyfill = applyFocusVisiblePolyfill;\n\n // Notify interested libraries of the polyfill's presence, in case the\n // polyfill was loaded lazily:\n var event;\n\n try {\n event = new CustomEvent('focus-visible-polyfill-ready');\n } catch (error) {\n // IE11 does not support using CustomEvent as a constructor directly:\n event = document.createEvent('CustomEvent');\n event.initCustomEvent('focus-visible-polyfill-ready', false, false, {});\n }\n\n window.dispatchEvent(event);\n }\n\n if (typeof document !== 'undefined') {\n // Apply the polyfill to the global document, so that no JavaScript\n // coordination is required to use the polyfill in the top-level document:\n applyFocusVisiblePolyfill(document);\n }\n\n})));\n", "/*!\n * escape-html\n * Copyright(c) 2012-2013 TJ Holowaychuk\n * Copyright(c) 2015 Andreas Lubbe\n * Copyright(c) 2015 Tiancheng \"Timothy\" Gu\n * MIT Licensed\n */\n\n'use strict';\n\n/**\n * Module variables.\n * @private\n */\n\nvar matchHtmlRegExp = /[\"'&<>]/;\n\n/**\n * Module exports.\n * @public\n */\n\nmodule.exports = escapeHtml;\n\n/**\n * Escape special characters in the given string of html.\n *\n * @param {string} string The string to 
escape for inserting into HTML\n * @return {string}\n * @public\n */\n\nfunction escapeHtml(string) {\n var str = '' + string;\n var match = matchHtmlRegExp.exec(str);\n\n if (!match) {\n return str;\n }\n\n var escape;\n var html = '';\n var index = 0;\n var lastIndex = 0;\n\n for (index = match.index; index < str.length; index++) {\n switch (str.charCodeAt(index)) {\n case 34: // \"\n escape = '"';\n break;\n case 38: // &\n escape = '&';\n break;\n case 39: // '\n escape = ''';\n break;\n case 60: // <\n escape = '<';\n break;\n case 62: // >\n escape = '>';\n break;\n default:\n continue;\n }\n\n if (lastIndex !== index) {\n html += str.substring(lastIndex, index);\n }\n\n lastIndex = index + 1;\n html += escape;\n }\n\n return lastIndex !== index\n ? html + str.substring(lastIndex, index)\n : html;\n}\n", "/*!\n * clipboard.js v2.0.11\n * https://clipboardjs.com/\n *\n * Licensed MIT \u00A9 Zeno Rocha\n */\n(function webpackUniversalModuleDefinition(root, factory) {\n\tif(typeof exports === 'object' && typeof module === 'object')\n\t\tmodule.exports = factory();\n\telse if(typeof define === 'function' && define.amd)\n\t\tdefine([], factory);\n\telse if(typeof exports === 'object')\n\t\texports[\"ClipboardJS\"] = factory();\n\telse\n\t\troot[\"ClipboardJS\"] = factory();\n})(this, function() {\nreturn /******/ (function() { // webpackBootstrap\n/******/ \tvar __webpack_modules__ = ({\n\n/***/ 686:\n/***/ (function(__unused_webpack_module, __webpack_exports__, __webpack_require__) {\n\n\"use strict\";\n\n// EXPORTS\n__webpack_require__.d(__webpack_exports__, {\n \"default\": function() { return /* binding */ clipboard; }\n});\n\n// EXTERNAL MODULE: ./node_modules/tiny-emitter/index.js\nvar tiny_emitter = __webpack_require__(279);\nvar tiny_emitter_default = /*#__PURE__*/__webpack_require__.n(tiny_emitter);\n// EXTERNAL MODULE: ./node_modules/good-listener/src/listen.js\nvar listen = __webpack_require__(370);\nvar listen_default = 
/*#__PURE__*/__webpack_require__.n(listen);\n// EXTERNAL MODULE: ./node_modules/select/src/select.js\nvar src_select = __webpack_require__(817);\nvar select_default = /*#__PURE__*/__webpack_require__.n(src_select);\n;// CONCATENATED MODULE: ./src/common/command.js\n/**\n * Executes a given operation type.\n * @param {String} type\n * @return {Boolean}\n */\nfunction command(type) {\n try {\n return document.execCommand(type);\n } catch (err) {\n return false;\n }\n}\n;// CONCATENATED MODULE: ./src/actions/cut.js\n\n\n/**\n * Cut action wrapper.\n * @param {String|HTMLElement} target\n * @return {String}\n */\n\nvar ClipboardActionCut = function ClipboardActionCut(target) {\n var selectedText = select_default()(target);\n command('cut');\n return selectedText;\n};\n\n/* harmony default export */ var actions_cut = (ClipboardActionCut);\n;// CONCATENATED MODULE: ./src/common/create-fake-element.js\n/**\n * Creates a fake textarea element with a value.\n * @param {String} value\n * @return {HTMLElement}\n */\nfunction createFakeElement(value) {\n var isRTL = document.documentElement.getAttribute('dir') === 'rtl';\n var fakeElement = document.createElement('textarea'); // Prevent zooming on iOS\n\n fakeElement.style.fontSize = '12pt'; // Reset box model\n\n fakeElement.style.border = '0';\n fakeElement.style.padding = '0';\n fakeElement.style.margin = '0'; // Move element out of screen horizontally\n\n fakeElement.style.position = 'absolute';\n fakeElement.style[isRTL ? 
'right' : 'left'] = '-9999px'; // Move element to the same position vertically\n\n var yPosition = window.pageYOffset || document.documentElement.scrollTop;\n fakeElement.style.top = \"\".concat(yPosition, \"px\");\n fakeElement.setAttribute('readonly', '');\n fakeElement.value = value;\n return fakeElement;\n}\n;// CONCATENATED MODULE: ./src/actions/copy.js\n\n\n\n/**\n * Create fake copy action wrapper using a fake element.\n * @param {String} target\n * @param {Object} options\n * @return {String}\n */\n\nvar fakeCopyAction = function fakeCopyAction(value, options) {\n var fakeElement = createFakeElement(value);\n options.container.appendChild(fakeElement);\n var selectedText = select_default()(fakeElement);\n command('copy');\n fakeElement.remove();\n return selectedText;\n};\n/**\n * Copy action wrapper.\n * @param {String|HTMLElement} target\n * @param {Object} options\n * @return {String}\n */\n\n\nvar ClipboardActionCopy = function ClipboardActionCopy(target) {\n var options = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {\n container: document.body\n };\n var selectedText = '';\n\n if (typeof target === 'string') {\n selectedText = fakeCopyAction(target, options);\n } else if (target instanceof HTMLInputElement && !['text', 'search', 'url', 'tel', 'password'].includes(target === null || target === void 0 ? void 0 : target.type)) {\n // If input type doesn't support `setSelectionRange`. Simulate it. 
https://developer.mozilla.org/en-US/docs/Web/API/HTMLInputElement/setSelectionRange\n selectedText = fakeCopyAction(target.value, options);\n } else {\n selectedText = select_default()(target);\n command('copy');\n }\n\n return selectedText;\n};\n\n/* harmony default export */ var actions_copy = (ClipboardActionCopy);\n;// CONCATENATED MODULE: ./src/actions/default.js\nfunction _typeof(obj) { \"@babel/helpers - typeof\"; if (typeof Symbol === \"function\" && typeof Symbol.iterator === \"symbol\") { _typeof = function _typeof(obj) { return typeof obj; }; } else { _typeof = function _typeof(obj) { return obj && typeof Symbol === \"function\" && obj.constructor === Symbol && obj !== Symbol.prototype ? \"symbol\" : typeof obj; }; } return _typeof(obj); }\n\n\n\n/**\n * Inner function which performs selection from either `text` or `target`\n * properties and then executes copy or cut operations.\n * @param {Object} options\n */\n\nvar ClipboardActionDefault = function ClipboardActionDefault() {\n var options = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : {};\n // Defines base properties passed from constructor.\n var _options$action = options.action,\n action = _options$action === void 0 ? 'copy' : _options$action,\n container = options.container,\n target = options.target,\n text = options.text; // Sets the `action` to be performed which can be either 'copy' or 'cut'.\n\n if (action !== 'copy' && action !== 'cut') {\n throw new Error('Invalid \"action\" value, use either \"copy\" or \"cut\"');\n } // Sets the `target` property using an element that will be have its content copied.\n\n\n if (target !== undefined) {\n if (target && _typeof(target) === 'object' && target.nodeType === 1) {\n if (action === 'copy' && target.hasAttribute('disabled')) {\n throw new Error('Invalid \"target\" attribute. 
Please use \"readonly\" instead of \"disabled\" attribute');\n }\n\n if (action === 'cut' && (target.hasAttribute('readonly') || target.hasAttribute('disabled'))) {\n throw new Error('Invalid \"target\" attribute. You can\\'t cut text from elements with \"readonly\" or \"disabled\" attributes');\n }\n } else {\n throw new Error('Invalid \"target\" value, use a valid Element');\n }\n } // Define selection strategy based on `text` property.\n\n\n if (text) {\n return actions_copy(text, {\n container: container\n });\n } // Defines which selection strategy based on `target` property.\n\n\n if (target) {\n return action === 'cut' ? actions_cut(target) : actions_copy(target, {\n container: container\n });\n }\n};\n\n/* harmony default export */ var actions_default = (ClipboardActionDefault);\n;// CONCATENATED MODULE: ./src/clipboard.js\nfunction clipboard_typeof(obj) { \"@babel/helpers - typeof\"; if (typeof Symbol === \"function\" && typeof Symbol.iterator === \"symbol\") { clipboard_typeof = function _typeof(obj) { return typeof obj; }; } else { clipboard_typeof = function _typeof(obj) { return obj && typeof Symbol === \"function\" && obj.constructor === Symbol && obj !== Symbol.prototype ? 
\"symbol\" : typeof obj; }; } return clipboard_typeof(obj); }\n\nfunction _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError(\"Cannot call a class as a function\"); } }\n\nfunction _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if (\"value\" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }\n\nfunction _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }\n\nfunction _inherits(subClass, superClass) { if (typeof superClass !== \"function\" && superClass !== null) { throw new TypeError(\"Super expression must either be null or a function\"); } subClass.prototype = Object.create(superClass && superClass.prototype, { constructor: { value: subClass, writable: true, configurable: true } }); if (superClass) _setPrototypeOf(subClass, superClass); }\n\nfunction _setPrototypeOf(o, p) { _setPrototypeOf = Object.setPrototypeOf || function _setPrototypeOf(o, p) { o.__proto__ = p; return o; }; return _setPrototypeOf(o, p); }\n\nfunction _createSuper(Derived) { var hasNativeReflectConstruct = _isNativeReflectConstruct(); return function _createSuperInternal() { var Super = _getPrototypeOf(Derived), result; if (hasNativeReflectConstruct) { var NewTarget = _getPrototypeOf(this).constructor; result = Reflect.construct(Super, arguments, NewTarget); } else { result = Super.apply(this, arguments); } return _possibleConstructorReturn(this, result); }; }\n\nfunction _possibleConstructorReturn(self, call) { if (call && (clipboard_typeof(call) === \"object\" || typeof call === \"function\")) { return call; } return _assertThisInitialized(self); }\n\nfunction _assertThisInitialized(self) { if 
(self === void 0) { throw new ReferenceError(\"this hasn't been initialised - super() hasn't been called\"); } return self; }\n\nfunction _isNativeReflectConstruct() { if (typeof Reflect === \"undefined\" || !Reflect.construct) return false; if (Reflect.construct.sham) return false; if (typeof Proxy === \"function\") return true; try { Date.prototype.toString.call(Reflect.construct(Date, [], function () {})); return true; } catch (e) { return false; } }\n\nfunction _getPrototypeOf(o) { _getPrototypeOf = Object.setPrototypeOf ? Object.getPrototypeOf : function _getPrototypeOf(o) { return o.__proto__ || Object.getPrototypeOf(o); }; return _getPrototypeOf(o); }\n\n\n\n\n\n\n/**\n * Helper function to retrieve attribute value.\n * @param {String} suffix\n * @param {Element} element\n */\n\nfunction getAttributeValue(suffix, element) {\n var attribute = \"data-clipboard-\".concat(suffix);\n\n if (!element.hasAttribute(attribute)) {\n return;\n }\n\n return element.getAttribute(attribute);\n}\n/**\n * Base class which takes one or more elements, adds event listeners to them,\n * and instantiates a new `ClipboardAction` on each click.\n */\n\n\nvar Clipboard = /*#__PURE__*/function (_Emitter) {\n _inherits(Clipboard, _Emitter);\n\n var _super = _createSuper(Clipboard);\n\n /**\n * @param {String|HTMLElement|HTMLCollection|NodeList} trigger\n * @param {Object} options\n */\n function Clipboard(trigger, options) {\n var _this;\n\n _classCallCheck(this, Clipboard);\n\n _this = _super.call(this);\n\n _this.resolveOptions(options);\n\n _this.listenClick(trigger);\n\n return _this;\n }\n /**\n * Defines if attributes would be resolved using internal setter functions\n * or custom functions that were passed in the constructor.\n * @param {Object} options\n */\n\n\n _createClass(Clipboard, [{\n key: \"resolveOptions\",\n value: function resolveOptions() {\n var options = arguments.length > 0 && arguments[0] !== undefined ? 
arguments[0] : {};\n this.action = typeof options.action === 'function' ? options.action : this.defaultAction;\n this.target = typeof options.target === 'function' ? options.target : this.defaultTarget;\n this.text = typeof options.text === 'function' ? options.text : this.defaultText;\n this.container = clipboard_typeof(options.container) === 'object' ? options.container : document.body;\n }\n /**\n * Adds a click event listener to the passed trigger.\n * @param {String|HTMLElement|HTMLCollection|NodeList} trigger\n */\n\n }, {\n key: \"listenClick\",\n value: function listenClick(trigger) {\n var _this2 = this;\n\n this.listener = listen_default()(trigger, 'click', function (e) {\n return _this2.onClick(e);\n });\n }\n /**\n * Defines a new `ClipboardAction` on each click event.\n * @param {Event} e\n */\n\n }, {\n key: \"onClick\",\n value: function onClick(e) {\n var trigger = e.delegateTarget || e.currentTarget;\n var action = this.action(trigger) || 'copy';\n var text = actions_default({\n action: action,\n container: this.container,\n target: this.target(trigger),\n text: this.text(trigger)\n }); // Fires an event based on the copy operation result.\n\n this.emit(text ? 
The returned\n * function should be a named function that calls `_super` internally.\n */\nexport function createErrorClass(createImpl: (_super: any) => any): T {\n const _super = (instance: any) => {\n Error.call(instance);\n instance.stack = new Error().stack;\n };\n\n const ctorFunc = createImpl(_super);\n ctorFunc.prototype = Object.create(Error.prototype);\n ctorFunc.prototype.constructor = ctorFunc;\n return ctorFunc;\n}\n", "import { createErrorClass } from './createErrorClass';\n\nexport interface UnsubscriptionError extends Error {\n readonly errors: any[];\n}\n\nexport interface UnsubscriptionErrorCtor {\n /**\n * @deprecated Internal implementation detail. Do not construct error instances.\n * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269\n */\n new (errors: any[]): UnsubscriptionError;\n}\n\n/**\n * An error thrown when one or more errors have occurred during the\n * `unsubscribe` of a {@link Subscription}.\n */\nexport const UnsubscriptionError: UnsubscriptionErrorCtor = createErrorClass(\n (_super) =>\n function UnsubscriptionErrorImpl(this: any, errors: (Error | string)[]) {\n _super(this);\n this.message = errors\n ? 
`${errors.length} errors occurred during unsubscription:\n${errors.map((err, i) => `${i + 1}) ${err.toString()}`).join('\\n ')}`\n : '';\n this.name = 'UnsubscriptionError';\n this.errors = errors;\n }\n);\n", "/**\n * Removes an item from an array, mutating it.\n * @param arr The array to remove the item from\n * @param item The item to remove\n */\nexport function arrRemove(arr: T[] | undefined | null, item: T) {\n if (arr) {\n const index = arr.indexOf(item);\n 0 <= index && arr.splice(index, 1);\n }\n}\n", "import { isFunction } from './util/isFunction';\nimport { UnsubscriptionError } from './util/UnsubscriptionError';\nimport { SubscriptionLike, TeardownLogic, Unsubscribable } from './types';\nimport { arrRemove } from './util/arrRemove';\n\n/**\n * Represents a disposable resource, such as the execution of an Observable. A\n * Subscription has one important method, `unsubscribe`, that takes no argument\n * and just disposes the resource held by the subscription.\n *\n * Additionally, subscriptions may be grouped together through the `add()`\n * method, which will attach a child Subscription to the current Subscription.\n * When a Subscription is unsubscribed, all its children (and its grandchildren)\n * will be unsubscribed as well.\n *\n * @class Subscription\n */\nexport class Subscription implements SubscriptionLike {\n /** @nocollapse */\n public static EMPTY = (() => {\n const empty = new Subscription();\n empty.closed = true;\n return empty;\n })();\n\n /**\n * A flag to indicate whether this Subscription has already been unsubscribed.\n */\n public closed = false;\n\n private _parentage: Subscription[] | Subscription | null = null;\n\n /**\n * The list of registered finalizers to execute upon unsubscription. 
Adding and removing from this\n * list occurs in the {@link #add} and {@link #remove} methods.\n */\n private _finalizers: Exclude[] | null = null;\n\n /**\n * @param initialTeardown A function executed first as part of the finalization\n * process that is kicked off when {@link #unsubscribe} is called.\n */\n constructor(private initialTeardown?: () => void) {}\n\n /**\n * Disposes the resources held by the subscription. May, for instance, cancel\n * an ongoing Observable execution or cancel any other type of work that\n * started when the Subscription was created.\n * @return {void}\n */\n unsubscribe(): void {\n let errors: any[] | undefined;\n\n if (!this.closed) {\n this.closed = true;\n\n // Remove this from it's parents.\n const { _parentage } = this;\n if (_parentage) {\n this._parentage = null;\n if (Array.isArray(_parentage)) {\n for (const parent of _parentage) {\n parent.remove(this);\n }\n } else {\n _parentage.remove(this);\n }\n }\n\n const { initialTeardown: initialFinalizer } = this;\n if (isFunction(initialFinalizer)) {\n try {\n initialFinalizer();\n } catch (e) {\n errors = e instanceof UnsubscriptionError ? e.errors : [e];\n }\n }\n\n const { _finalizers } = this;\n if (_finalizers) {\n this._finalizers = null;\n for (const finalizer of _finalizers) {\n try {\n execFinalizer(finalizer);\n } catch (err) {\n errors = errors ?? [];\n if (err instanceof UnsubscriptionError) {\n errors = [...errors, ...err.errors];\n } else {\n errors.push(err);\n }\n }\n }\n }\n\n if (errors) {\n throw new UnsubscriptionError(errors);\n }\n }\n }\n\n /**\n * Adds a finalizer to this subscription, so that finalization will be unsubscribed/called\n * when this subscription is unsubscribed. 
If this subscription is already {@link #closed},\n * because it has already been unsubscribed, then whatever finalizer is passed to it\n * will automatically be executed (unless the finalizer itself is also a closed subscription).\n *\n * Closed Subscriptions cannot be added as finalizers to any subscription. Adding a closed\n * subscription to a any subscription will result in no operation. (A noop).\n *\n * Adding a subscription to itself, or adding `null` or `undefined` will not perform any\n * operation at all. (A noop).\n *\n * `Subscription` instances that are added to this instance will automatically remove themselves\n * if they are unsubscribed. Functions and {@link Unsubscribable} objects that you wish to remove\n * will need to be removed manually with {@link #remove}\n *\n * @param teardown The finalization logic to add to this subscription.\n */\n add(teardown: TeardownLogic): void {\n // Only add the finalizer if it's not undefined\n // and don't add a subscription to itself.\n if (teardown && teardown !== this) {\n if (this.closed) {\n // If this subscription is already closed,\n // execute whatever finalizer is handed to it automatically.\n execFinalizer(teardown);\n } else {\n if (teardown instanceof Subscription) {\n // We don't add closed subscriptions, and we don't add the same subscription\n // twice. Subscription unsubscribe is idempotent.\n if (teardown.closed || teardown._hasParent(this)) {\n return;\n }\n teardown._addParent(this);\n }\n (this._finalizers = this._finalizers ?? 
[]).push(teardown);\n }\n }\n }\n\n /**\n * Checks to see if a this subscription already has a particular parent.\n * This will signal that this subscription has already been added to the parent in question.\n * @param parent the parent to check for\n */\n private _hasParent(parent: Subscription) {\n const { _parentage } = this;\n return _parentage === parent || (Array.isArray(_parentage) && _parentage.includes(parent));\n }\n\n /**\n * Adds a parent to this subscription so it can be removed from the parent if it\n * unsubscribes on it's own.\n *\n * NOTE: THIS ASSUMES THAT {@link _hasParent} HAS ALREADY BEEN CHECKED.\n * @param parent The parent subscription to add\n */\n private _addParent(parent: Subscription) {\n const { _parentage } = this;\n this._parentage = Array.isArray(_parentage) ? (_parentage.push(parent), _parentage) : _parentage ? [_parentage, parent] : parent;\n }\n\n /**\n * Called on a child when it is removed via {@link #remove}.\n * @param parent The parent to remove\n */\n private _removeParent(parent: Subscription) {\n const { _parentage } = this;\n if (_parentage === parent) {\n this._parentage = null;\n } else if (Array.isArray(_parentage)) {\n arrRemove(_parentage, parent);\n }\n }\n\n /**\n * Removes a finalizer from this subscription that was previously added with the {@link #add} method.\n *\n * Note that `Subscription` instances, when unsubscribed, will automatically remove themselves\n * from every other `Subscription` they have been added to. 
This means that using the `remove` method\n * is not a common thing and should be used thoughtfully.\n *\n * If you add the same finalizer instance of a function or an unsubscribable object to a `Subscription` instance\n * more than once, you will need to call `remove` the same number of times to remove all instances.\n *\n * All finalizer instances are removed to free up memory upon unsubscription.\n *\n * @param teardown The finalizer to remove from this subscription\n */\n remove(teardown: Exclude): void {\n const { _finalizers } = this;\n _finalizers && arrRemove(_finalizers, teardown);\n\n if (teardown instanceof Subscription) {\n teardown._removeParent(this);\n }\n }\n}\n\nexport const EMPTY_SUBSCRIPTION = Subscription.EMPTY;\n\nexport function isSubscription(value: any): value is Subscription {\n return (\n value instanceof Subscription ||\n (value && 'closed' in value && isFunction(value.remove) && isFunction(value.add) && isFunction(value.unsubscribe))\n );\n}\n\nfunction execFinalizer(finalizer: Unsubscribable | (() => void)) {\n if (isFunction(finalizer)) {\n finalizer();\n } else {\n finalizer.unsubscribe();\n }\n}\n", "import { Subscriber } from './Subscriber';\nimport { ObservableNotification } from './types';\n\n/**\n * The {@link GlobalConfig} object for RxJS. It is used to configure things\n * like how to react on unhandled errors.\n */\nexport const config: GlobalConfig = {\n onUnhandledError: null,\n onStoppedNotification: null,\n Promise: undefined,\n useDeprecatedSynchronousErrorHandling: false,\n useDeprecatedNextContext: false,\n};\n\n/**\n * The global configuration object for RxJS, used to configure things\n * like how to react on unhandled errors. Accessible via {@link config}\n * object.\n */\nexport interface GlobalConfig {\n /**\n * A registration point for unhandled errors from RxJS. These are errors that\n * cannot were not handled by consuming code in the usual subscription path. 
For\n * example, if you have this configured, and you subscribe to an observable without\n * providing an error handler, errors from that subscription will end up here. This\n * will _always_ be called asynchronously on another job in the runtime. This is because\n * we do not want errors thrown in this user-configured handler to interfere with the\n * behavior of the library.\n */\n onUnhandledError: ((err: any) => void) | null;\n\n /**\n * A registration point for notifications that cannot be sent to subscribers because they\n * have completed, errored or have been explicitly unsubscribed. By default, next, complete\n * and error notifications sent to stopped subscribers are noops. However, sometimes callers\n * might want a different behavior. For example, with sources that attempt to report errors\n * to stopped subscribers, a caller can configure RxJS to throw an unhandled error instead.\n * This will _always_ be called asynchronously on another job in the runtime. This is because\n * we do not want errors thrown in this user-configured handler to interfere with the\n * behavior of the library.\n */\n onStoppedNotification: ((notification: ObservableNotification, subscriber: Subscriber) => void) | null;\n\n /**\n * The promise constructor used by default for {@link Observable#toPromise toPromise} and {@link Observable#forEach forEach}\n * methods.\n *\n * @deprecated As of version 8, RxJS will no longer support this sort of injection of a\n * Promise constructor. If you need a Promise implementation other than native promises,\n * please polyfill/patch Promise as you see appropriate. Will be removed in v8.\n */\n Promise?: PromiseConstructorLike;\n\n /**\n * If true, turns on synchronous error rethrowing, which is a deprecated behavior\n * in v6 and higher. This behavior enables bad patterns like wrapping a subscribe\n * call in a try/catch block. 
It also enables producer interference, a nasty bug\n * where a multicast can be broken for all observers by a downstream consumer with\n * an unhandled error. DO NOT USE THIS FLAG UNLESS IT'S NEEDED TO BUY TIME\n * FOR MIGRATION REASONS.\n *\n * @deprecated As of version 8, RxJS will no longer support synchronous throwing\n * of unhandled errors. All errors will be thrown on a separate call stack to prevent bad\n * behaviors described above. Will be removed in v8.\n */\n useDeprecatedSynchronousErrorHandling: boolean;\n\n /**\n * If true, enables an as-of-yet undocumented feature from v5: The ability to access\n * `unsubscribe()` via `this` context in `next` functions created in observers passed\n * to `subscribe`.\n *\n * This is being removed because the performance was severely problematic, and it could also cause\n * issues when types other than POJOs are passed to subscribe as subscribers, as they will likely have\n * their `this` context overwritten.\n *\n * @deprecated As of version 8, RxJS will no longer support altering the\n * context of next functions provided as part of an observer to Subscribe. Instead,\n * you will have access to a subscription or a signal or token that will allow you to do things like\n * unsubscribe and test closed status. 
Will be removed in v8.\n */\n useDeprecatedNextContext: boolean;\n}\n", "import type { TimerHandle } from './timerHandle';\ntype SetTimeoutFunction = (handler: () => void, timeout?: number, ...args: any[]) => TimerHandle;\ntype ClearTimeoutFunction = (handle: TimerHandle) => void;\n\ninterface TimeoutProvider {\n setTimeout: SetTimeoutFunction;\n clearTimeout: ClearTimeoutFunction;\n delegate:\n | {\n setTimeout: SetTimeoutFunction;\n clearTimeout: ClearTimeoutFunction;\n }\n | undefined;\n}\n\nexport const timeoutProvider: TimeoutProvider = {\n // When accessing the delegate, use the variable rather than `this` so that\n // the functions can be called without being bound to the provider.\n setTimeout(handler: () => void, timeout?: number, ...args) {\n const { delegate } = timeoutProvider;\n if (delegate?.setTimeout) {\n return delegate.setTimeout(handler, timeout, ...args);\n }\n return setTimeout(handler, timeout, ...args);\n },\n clearTimeout(handle) {\n const { delegate } = timeoutProvider;\n return (delegate?.clearTimeout || clearTimeout)(handle as any);\n },\n delegate: undefined,\n};\n", "import { config } from '../config';\nimport { timeoutProvider } from '../scheduler/timeoutProvider';\n\n/**\n * Handles an error on another job either with the user-configured {@link onUnhandledError},\n * or by throwing it on that new job so it can be picked up by `window.onerror`, `process.on('error')`, etc.\n *\n * This should be called whenever there is an error that is out-of-band with the subscription\n * or when an error hits a terminal boundary of the subscription and no error handler was provided.\n *\n * @param err the error to report\n */\nexport function reportUnhandledError(err: any) {\n timeoutProvider.setTimeout(() => {\n const { onUnhandledError } = config;\n if (onUnhandledError) {\n // Execute the user-configured error handler.\n onUnhandledError(err);\n } else {\n // Throw so it is picked up by the runtime's uncaught error mechanism.\n throw err;\n }\n 
});\n}\n", "/* tslint:disable:no-empty */\nexport function noop() { }\n", "import { CompleteNotification, NextNotification, ErrorNotification } from './types';\n\n/**\n * A completion object optimized for memory use and created to be the\n * same \"shape\" as other notifications in v8.\n * @internal\n */\nexport const COMPLETE_NOTIFICATION = (() => createNotification('C', undefined, undefined) as CompleteNotification)();\n\n/**\n * Internal use only. Creates an optimized error notification that is the same \"shape\"\n * as other notifications.\n * @internal\n */\nexport function errorNotification(error: any): ErrorNotification {\n return createNotification('E', undefined, error) as any;\n}\n\n/**\n * Internal use only. Creates an optimized next notification that is the same \"shape\"\n * as other notifications.\n * @internal\n */\nexport function nextNotification(value: T) {\n return createNotification('N', value, undefined) as NextNotification;\n}\n\n/**\n * Ensures that all notifications created internally have the same \"shape\" in v8.\n *\n * TODO: This is only exported to support a crazy legacy test in `groupBy`.\n * @internal\n */\nexport function createNotification(kind: 'N' | 'E' | 'C', value: any, error: any) {\n return {\n kind,\n value,\n error,\n };\n}\n", "import { config } from '../config';\n\nlet context: { errorThrown: boolean; error: any } | null = null;\n\n/**\n * Handles dealing with errors for super-gross mode. Creates a context, in which\n * any synchronously thrown errors will be passed to {@link captureError}. 
Which\n * will record the error such that it will be rethrown after the call back is complete.\n * TODO: Remove in v8\n * @param cb An immediately executed function.\n */\nexport function errorContext(cb: () => void) {\n if (config.useDeprecatedSynchronousErrorHandling) {\n const isRoot = !context;\n if (isRoot) {\n context = { errorThrown: false, error: null };\n }\n cb();\n if (isRoot) {\n const { errorThrown, error } = context!;\n context = null;\n if (errorThrown) {\n throw error;\n }\n }\n } else {\n // This is the general non-deprecated path for everyone that\n // isn't crazy enough to use super-gross mode (useDeprecatedSynchronousErrorHandling)\n cb();\n }\n}\n\n/**\n * Captures errors only in super-gross mode.\n * @param err the error to capture\n */\nexport function captureError(err: any) {\n if (config.useDeprecatedSynchronousErrorHandling && context) {\n context.errorThrown = true;\n context.error = err;\n }\n}\n", "import { isFunction } from './util/isFunction';\nimport { Observer, ObservableNotification } from './types';\nimport { isSubscription, Subscription } from './Subscription';\nimport { config } from './config';\nimport { reportUnhandledError } from './util/reportUnhandledError';\nimport { noop } from './util/noop';\nimport { nextNotification, errorNotification, COMPLETE_NOTIFICATION } from './NotificationFactories';\nimport { timeoutProvider } from './scheduler/timeoutProvider';\nimport { captureError } from './util/errorContext';\n\n/**\n * Implements the {@link Observer} interface and extends the\n * {@link Subscription} class. While the {@link Observer} is the public API for\n * consuming the values of an {@link Observable}, all Observers get converted to\n * a Subscriber, in order to provide Subscription-like capabilities such as\n * `unsubscribe`. 
Subscriber is a common type in RxJS, and crucial for\n * implementing operators, but it is rarely used as a public API.\n *\n * @class Subscriber\n */\nexport class Subscriber extends Subscription implements Observer {\n /**\n * A static factory for a Subscriber, given a (potentially partial) definition\n * of an Observer.\n * @param next The `next` callback of an Observer.\n * @param error The `error` callback of an\n * Observer.\n * @param complete The `complete` callback of an\n * Observer.\n * @return A Subscriber wrapping the (partially defined)\n * Observer represented by the given arguments.\n * @nocollapse\n * @deprecated Do not use. Will be removed in v8. There is no replacement for this\n * method, and there is no reason to be creating instances of `Subscriber` directly.\n * If you have a specific use case, please file an issue.\n */\n static create(next?: (x?: T) => void, error?: (e?: any) => void, complete?: () => void): Subscriber {\n return new SafeSubscriber(next, error, complete);\n }\n\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n protected isStopped: boolean = false;\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n protected destination: Subscriber | Observer; // this `any` is the escape hatch to erase extra type param (e.g. R)\n\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n * There is no reason to directly create an instance of Subscriber. 
This type is exported for typings reasons.\n */\n constructor(destination?: Subscriber | Observer) {\n super();\n if (destination) {\n this.destination = destination;\n // Automatically chain subscriptions together here.\n // if destination is a Subscription, then it is a Subscriber.\n if (isSubscription(destination)) {\n destination.add(this);\n }\n } else {\n this.destination = EMPTY_OBSERVER;\n }\n }\n\n /**\n * The {@link Observer} callback to receive notifications of type `next` from\n * the Observable, with a value. The Observable may call this method 0 or more\n * times.\n * @param {T} [value] The `next` value.\n * @return {void}\n */\n next(value?: T): void {\n if (this.isStopped) {\n handleStoppedNotification(nextNotification(value), this);\n } else {\n this._next(value!);\n }\n }\n\n /**\n * The {@link Observer} callback to receive notifications of type `error` from\n * the Observable, with an attached `Error`. Notifies the Observer that\n * the Observable has experienced an error condition.\n * @param {any} [err] The `error` exception.\n * @return {void}\n */\n error(err?: any): void {\n if (this.isStopped) {\n handleStoppedNotification(errorNotification(err), this);\n } else {\n this.isStopped = true;\n this._error(err);\n }\n }\n\n /**\n * The {@link Observer} callback to receive a valueless notification of type\n * `complete` from the Observable. 
Notifies the Observer that the Observable\n * has finished sending push-based notifications.\n * @return {void}\n */\n complete(): void {\n if (this.isStopped) {\n handleStoppedNotification(COMPLETE_NOTIFICATION, this);\n } else {\n this.isStopped = true;\n this._complete();\n }\n }\n\n unsubscribe(): void {\n if (!this.closed) {\n this.isStopped = true;\n super.unsubscribe();\n this.destination = null!;\n }\n }\n\n protected _next(value: T): void {\n this.destination.next(value);\n }\n\n protected _error(err: any): void {\n try {\n this.destination.error(err);\n } finally {\n this.unsubscribe();\n }\n }\n\n protected _complete(): void {\n try {\n this.destination.complete();\n } finally {\n this.unsubscribe();\n }\n }\n}\n\n/**\n * This bind is captured here because we want to be able to have\n * compatibility with monoid libraries that tend to use a method named\n * `bind`. In particular, a library called Monio requires this.\n */\nconst _bind = Function.prototype.bind;\n\nfunction bind any>(fn: Fn, thisArg: any): Fn {\n return _bind.call(fn, thisArg);\n}\n\n/**\n * Internal optimization only, DO NOT EXPOSE.\n * @internal\n */\nclass ConsumerObserver implements Observer {\n constructor(private partialObserver: Partial>) {}\n\n next(value: T): void {\n const { partialObserver } = this;\n if (partialObserver.next) {\n try {\n partialObserver.next(value);\n } catch (error) {\n handleUnhandledError(error);\n }\n }\n }\n\n error(err: any): void {\n const { partialObserver } = this;\n if (partialObserver.error) {\n try {\n partialObserver.error(err);\n } catch (error) {\n handleUnhandledError(error);\n }\n } else {\n handleUnhandledError(err);\n }\n }\n\n complete(): void {\n const { partialObserver } = this;\n if (partialObserver.complete) {\n try {\n partialObserver.complete();\n } catch (error) {\n handleUnhandledError(error);\n }\n }\n }\n}\n\nexport class SafeSubscriber extends Subscriber {\n constructor(\n observerOrNext?: Partial> | ((value: T) => void) | 
null,\n error?: ((e?: any) => void) | null,\n complete?: (() => void) | null\n ) {\n super();\n\n let partialObserver: Partial>;\n if (isFunction(observerOrNext) || !observerOrNext) {\n // The first argument is a function, not an observer. The next\n // two arguments *could* be observers, or they could be empty.\n partialObserver = {\n next: (observerOrNext ?? undefined) as (((value: T) => void) | undefined),\n error: error ?? undefined,\n complete: complete ?? undefined,\n };\n } else {\n // The first argument is a partial observer.\n let context: any;\n if (this && config.useDeprecatedNextContext) {\n // This is a deprecated path that made `this.unsubscribe()` available in\n // next handler functions passed to subscribe. This only exists behind a flag\n // now, as it is *very* slow.\n context = Object.create(observerOrNext);\n context.unsubscribe = () => this.unsubscribe();\n partialObserver = {\n next: observerOrNext.next && bind(observerOrNext.next, context),\n error: observerOrNext.error && bind(observerOrNext.error, context),\n complete: observerOrNext.complete && bind(observerOrNext.complete, context),\n };\n } else {\n // The \"normal\" path. 
Just use the partial observer directly.\n partialObserver = observerOrNext;\n }\n }\n\n // Wrap the partial observer to ensure it's a full observer, and\n // make sure proper error handling is accounted for.\n this.destination = new ConsumerObserver(partialObserver);\n }\n}\n\nfunction handleUnhandledError(error: any) {\n if (config.useDeprecatedSynchronousErrorHandling) {\n captureError(error);\n } else {\n // Ideal path, we report this as an unhandled error,\n // which is thrown on a new call stack.\n reportUnhandledError(error);\n }\n}\n\n/**\n * An error handler used when no error handler was supplied\n * to the SafeSubscriber -- meaning no error handler was supplied\n * do the `subscribe` call on our observable.\n * @param err The error to handle\n */\nfunction defaultErrorHandler(err: any) {\n throw err;\n}\n\n/**\n * A handler for notifications that cannot be sent to a stopped subscriber.\n * @param notification The notification being sent\n * @param subscriber The stopped subscriber\n */\nfunction handleStoppedNotification(notification: ObservableNotification, subscriber: Subscriber) {\n const { onStoppedNotification } = config;\n onStoppedNotification && timeoutProvider.setTimeout(() => onStoppedNotification(notification, subscriber));\n}\n\n/**\n * The observer used as a stub for subscriptions where the user did not\n * pass any arguments to `subscribe`. Comes with the default error handling\n * behavior.\n */\nexport const EMPTY_OBSERVER: Readonly> & { closed: true } = {\n closed: true,\n next: noop,\n error: defaultErrorHandler,\n complete: noop,\n};\n", "/**\n * Symbol.observable or a string \"@@observable\". 
Used for interop\n *\n * @deprecated We will no longer be exporting this symbol in upcoming versions of RxJS.\n * Instead polyfill and use Symbol.observable directly *or* use https://www.npmjs.com/package/symbol-observable\n */\nexport const observable: string | symbol = (() => (typeof Symbol === 'function' && Symbol.observable) || '@@observable')();\n", "/**\n * This function takes one parameter and just returns it. Simply put,\n * this is like `(x: T): T => x`.\n *\n * ## Examples\n *\n * This is useful in some cases when using things like `mergeMap`\n *\n * ```ts\n * import { interval, take, map, range, mergeMap, identity } from 'rxjs';\n *\n * const source$ = interval(1000).pipe(take(5));\n *\n * const result$ = source$.pipe(\n * map(i => range(i)),\n * mergeMap(identity) // same as mergeMap(x => x)\n * );\n *\n * result$.subscribe({\n * next: console.log\n * });\n * ```\n *\n * Or when you want to selectively apply an operator\n *\n * ```ts\n * import { interval, take, identity } from 'rxjs';\n *\n * const shouldLimit = () => Math.random() < 0.5;\n *\n * const source$ = interval(1000);\n *\n * const result$ = source$.pipe(shouldLimit() ? 
take(5) : identity);\n *\n * result$.subscribe({\n * next: console.log\n * });\n * ```\n *\n * @param x Any value that is returned by this function\n * @returns The value passed as the first parameter to this function\n */\nexport function identity(x: T): T {\n return x;\n}\n", "import { identity } from './identity';\nimport { UnaryFunction } from '../types';\n\nexport function pipe(): typeof identity;\nexport function pipe(fn1: UnaryFunction): UnaryFunction;\nexport function pipe(fn1: UnaryFunction, fn2: UnaryFunction): UnaryFunction;\nexport function pipe(fn1: UnaryFunction, fn2: UnaryFunction, fn3: UnaryFunction): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction,\n fn9: UnaryFunction\n): UnaryFunction;\nexport function pipe(\n fn1: UnaryFunction,\n fn2: UnaryFunction,\n fn3: UnaryFunction,\n fn4: UnaryFunction,\n fn5: UnaryFunction,\n fn6: UnaryFunction,\n fn7: UnaryFunction,\n fn8: UnaryFunction,\n fn9: UnaryFunction,\n ...fns: UnaryFunction[]\n): 
UnaryFunction;\n\n/**\n * pipe() can be called on one or more functions, each of which can take one argument (\"UnaryFunction\")\n * and uses it to return a value.\n * It returns a function that takes one argument, passes it to the first UnaryFunction, and then\n * passes the result to the next one, passes that result to the next one, and so on. \n */\nexport function pipe(...fns: Array>): UnaryFunction {\n return pipeFromArray(fns);\n}\n\n/** @internal */\nexport function pipeFromArray(fns: Array>): UnaryFunction {\n if (fns.length === 0) {\n return identity as UnaryFunction;\n }\n\n if (fns.length === 1) {\n return fns[0];\n }\n\n return function piped(input: T): R {\n return fns.reduce((prev: any, fn: UnaryFunction) => fn(prev), input as any);\n };\n}\n", "import { Operator } from './Operator';\nimport { SafeSubscriber, Subscriber } from './Subscriber';\nimport { isSubscription, Subscription } from './Subscription';\nimport { TeardownLogic, OperatorFunction, Subscribable, Observer } from './types';\nimport { observable as Symbol_observable } from './symbol/observable';\nimport { pipeFromArray } from './util/pipe';\nimport { config } from './config';\nimport { isFunction } from './util/isFunction';\nimport { errorContext } from './util/errorContext';\n\n/**\n * A representation of any set of values over any amount of time. This is the most basic building block\n * of RxJS.\n *\n * @class Observable\n */\nexport class Observable implements Subscribable {\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n */\n source: Observable | undefined;\n\n /**\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n */\n operator: Operator | undefined;\n\n /**\n * @constructor\n * @param {Function} subscribe the function that is called when the Observable is\n * initially subscribed to. 
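The left-to-right reduction that `pipe`/`pipeFromArray` perform can be sketched standalone, with no RxJS dependency. `composeFromArray` and `Unary` are hypothetical names introduced for this sketch; the zero-, one-, and many-function branches mirror the source above.

```typescript
// Standalone sketch (hypothetical composeFromArray) of the left-to-right
// reduction pipeFromArray performs: the input is fed to the first function
// and each result is passed to the next.
type Unary<T, R> = (source: T) => R;

function composeFromArray<T>(fns: Array<Unary<any, any>>): Unary<T, any> {
  if (fns.length === 0) {
    return (x: T) => x; // zero functions: identity, as in the source above
  }
  if (fns.length === 1) {
    return fns[0];
  }
  return (input: T) => fns.reduce((prev, fn) => fn(prev), input as any);
}

const addOne: Unary<number, number> = (x) => x + 1;
const double: Unary<number, number> = (x) => x * 2;
const addThenDouble = composeFromArray<number>([addOne, double]);
// addThenDouble(3) -> (3 + 1) * 2 = 8
```

Note the evaluation order: the first function listed runs first, which is why `addThenDouble(3)` adds before doubling.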
This function is given a Subscriber, to which new values\n * can be `next`ed, or an `error` method can be called to raise an error, or\n * `complete` can be called to notify of a successful completion.\n */\n constructor(subscribe?: (this: Observable, subscriber: Subscriber) => TeardownLogic) {\n if (subscribe) {\n this._subscribe = subscribe;\n }\n }\n\n // HACK: Since TypeScript inherits static properties too, we have to\n // fight against TypeScript here so Subject can have a different static create signature\n /**\n * Creates a new Observable by calling the Observable constructor\n * @owner Observable\n * @method create\n * @param {Function} subscribe? the subscriber function to be passed to the Observable constructor\n * @return {Observable} a new observable\n * @nocollapse\n * @deprecated Use `new Observable()` instead. Will be removed in v8.\n */\n static create: (...args: any[]) => any = (subscribe?: (subscriber: Subscriber) => TeardownLogic) => {\n return new Observable(subscribe);\n };\n\n /**\n * Creates a new Observable, with this Observable instance as the source, and the passed\n * operator defined as the new observable's operator.\n * @method lift\n * @param operator the operator defining the operation to take on the observable\n * @return a new observable with the Operator applied\n * @deprecated Internal implementation detail, do not use directly. Will be made internal in v8.\n * If you have implemented an operator using `lift`, it is recommended that you create an\n * operator by simply returning `new Observable()` directly. 
See \"Creating new operators from\n * scratch\" section here: https://rxjs.dev/guide/operators\n */\n lift<R>(operator?: Operator<T, R>): Observable<R> {\n const observable = new Observable<R>();\n observable.source = this;\n observable.operator = operator;\n return observable;\n }\n\n subscribe(observerOrNext?: Partial<Observer<T>> | ((value: T) => void)): Subscription;\n /** @deprecated Instead of passing separate callback arguments, use an observer argument. Signatures taking separate callback arguments will be removed in v8. Details: https://rxjs.dev/deprecations/subscribe-arguments */\n subscribe(next?: ((value: T) => void) | null, error?: ((error: any) => void) | null, complete?: (() => void) | null): Subscription;\n /**\n * Invokes an execution of an Observable and registers Observer handlers for notifications it will emit.\n *\n * Use it when you have all these Observables, but still nothing is happening.\n *\n * `subscribe` is not a regular operator, but a method that calls Observable's internal `subscribe` function. It\n * might be for example a function that you passed to Observable's constructor, but most of the time it is\n * a library implementation, which defines what will be emitted by an Observable, and when it will be emitted. This means\n * that calling `subscribe` is actually the moment when Observable starts its work, not when it is created, as is often\n * thought.\n *\n * Apart from starting the execution of an Observable, this method allows you to listen for values\n * that an Observable emits, as well as for when it completes or errors. You can achieve this in two\n * of the following ways.\n *\n * The first way is creating an object that implements {@link Observer} interface. It should have methods\n * defined by that interface, but note that it should be just a regular JavaScript object, which you can create\n * yourself in any way you want (ES6 class, classic function constructor, object literal etc.). 
In particular, do\n * not attempt to use any RxJS implementation details to create Observers - you don't need them. Remember also\n * that your object does not have to implement all methods. If you find yourself creating a method that doesn't\n * do anything, you can simply omit it. Note however, if the `error` method is not provided and an error happens,\n * it will be thrown asynchronously. Errors thrown asynchronously cannot be caught using `try`/`catch`. Instead,\n * use the {@link onUnhandledError} configuration option or use a runtime handler (like `window.onerror` or\n * `process.on('error')`) to be notified of unhandled errors. Because of this, it's recommended that you provide\n * an `error` method to avoid missing thrown errors.\n *\n * The second way is to give up on the Observer object altogether and simply provide callback functions in place of its methods.\n * This means you can provide three functions as arguments to `subscribe`, where the first function is the equivalent\n * of a `next` method, the second of an `error` method and the third of a `complete` method. Just as in the case of an Observer,\n * if you do not need to listen for something, you can omit a function by passing `undefined` or `null`,\n * since `subscribe` recognizes these functions by where they were placed in the function call. When it comes\n * to the `error` function, as with an Observer, if not provided, errors emitted by an Observable will be thrown asynchronously.\n *\n * You can, however, subscribe with no parameters at all. This may be the case where you're not interested in terminal events\n * and you also handled emissions internally by using operators (e.g. using `tap`).\n *\n * Whichever style of calling `subscribe` you use, in both cases it returns a Subscription object.\n * This object allows you to call `unsubscribe` on it, which in turn will stop the work that an Observable does and will clean\n * up all resources that an Observable used. 
Note that cancelling a subscription will not call `complete` callback\n * provided to `subscribe` function, which is reserved for a regular completion signal that comes from an Observable.\n *\n * Remember that callbacks provided to `subscribe` are not guaranteed to be called asynchronously.\n * It is an Observable itself that decides when these functions will be called. For example {@link of}\n * by default emits all its values synchronously. Always check documentation for how given Observable\n * will behave when subscribed and if its default behavior can be modified with a `scheduler`.\n *\n * #### Examples\n *\n * Subscribe with an {@link guide/observer Observer}\n *\n * ```ts\n * import { of } from 'rxjs';\n *\n * const sumObserver = {\n * sum: 0,\n * next(value) {\n * console.log('Adding: ' + value);\n * this.sum = this.sum + value;\n * },\n * error() {\n * // We actually could just remove this method,\n * // since we do not really care about errors right now.\n * },\n * complete() {\n * console.log('Sum equals: ' + this.sum);\n * }\n * };\n *\n * of(1, 2, 3) // Synchronously emits 1, 2, 3 and then completes.\n * .subscribe(sumObserver);\n *\n * // Logs:\n * // 'Adding: 1'\n * // 'Adding: 2'\n * // 'Adding: 3'\n * // 'Sum equals: 6'\n * ```\n *\n * Subscribe with functions ({@link deprecations/subscribe-arguments deprecated})\n *\n * ```ts\n * import { of } from 'rxjs'\n *\n * let sum = 0;\n *\n * of(1, 2, 3).subscribe(\n * value => {\n * console.log('Adding: ' + value);\n * sum = sum + value;\n * },\n * undefined,\n * () => console.log('Sum equals: ' + sum)\n * );\n *\n * // Logs:\n * // 'Adding: 1'\n * // 'Adding: 2'\n * // 'Adding: 3'\n * // 'Sum equals: 6'\n * ```\n *\n * Cancel a subscription\n *\n * ```ts\n * import { interval } from 'rxjs';\n *\n * const subscription = interval(1000).subscribe({\n * next(num) {\n * console.log(num)\n * },\n * complete() {\n * // Will not be called, even when cancelling subscription.\n * console.log('completed!');\n * 
}\n * });\n *\n * setTimeout(() => {\n * subscription.unsubscribe();\n * console.log('unsubscribed!');\n * }, 2500);\n *\n * // Logs:\n * // 0 after 1s\n * // 1 after 2s\n * // 'unsubscribed!' after 2.5s\n * ```\n *\n * @param {Observer|Function} observerOrNext (optional) Either an observer with methods to be called,\n * or the first of three possible handlers, which is the handler for each value emitted from the subscribed\n * Observable.\n * @param {Function} error (optional) A handler for a terminal event resulting from an error. If no error handler is provided,\n * the error will be thrown asynchronously as unhandled.\n * @param {Function} complete (optional) A handler for a terminal event resulting from successful completion.\n * @return {Subscription} a subscription reference to the registered handlers\n * @method subscribe\n */\n subscribe(\n observerOrNext?: Partial> | ((value: T) => void) | null,\n error?: ((error: any) => void) | null,\n complete?: (() => void) | null\n ): Subscription {\n const subscriber = isSubscriber(observerOrNext) ? observerOrNext : new SafeSubscriber(observerOrNext, error, complete);\n\n errorContext(() => {\n const { operator, source } = this;\n subscriber.add(\n operator\n ? // We're dealing with a subscription in the\n // operator chain to one of our lifted operators.\n operator.call(subscriber, source)\n : source\n ? // If `source` has a value, but `operator` does not, something that\n // had intimate knowledge of our API, like our `Subject`, must have\n // set it. 
We're going to just call `_subscribe` directly.\n this._subscribe(subscriber)\n : // In all other cases, we're likely wrapping a user-provided initializer\n // function, so we need to catch errors and handle them appropriately.\n this._trySubscribe(subscriber)\n );\n });\n\n return subscriber;\n }\n\n /** @internal */\n protected _trySubscribe(sink: Subscriber): TeardownLogic {\n try {\n return this._subscribe(sink);\n } catch (err) {\n // We don't need to return anything in this case,\n // because it's just going to try to `add()` to a subscription\n // above.\n sink.error(err);\n }\n }\n\n /**\n * Used as a NON-CANCELLABLE means of subscribing to an observable, for use with\n * APIs that expect promises, like `async/await`. You cannot unsubscribe from this.\n *\n * **WARNING**: Only use this with observables you *know* will complete. If the source\n * observable does not complete, you will end up with a promise that is hung up, and\n * potentially all of the state of an async function hanging out in memory. 
To avoid\n * this situation, look into adding something like {@link timeout}, {@link take},\n * {@link takeWhile}, or {@link takeUntil} amongst others.\n *\n * #### Example\n *\n * ```ts\n * import { interval, take } from 'rxjs';\n *\n * const source$ = interval(1000).pipe(take(4));\n *\n * async function getTotal() {\n * let total = 0;\n *\n * await source$.forEach(value => {\n * total += value;\n * console.log('observable -> ' + value);\n * });\n *\n * return total;\n * }\n *\n * getTotal().then(\n * total => console.log('Total: ' + total)\n * );\n *\n * // Expected:\n * // 'observable -> 0'\n * // 'observable -> 1'\n * // 'observable -> 2'\n * // 'observable -> 3'\n * // 'Total: 6'\n * ```\n *\n * @param next a handler for each value emitted by the observable\n * @return a promise that either resolves on observable completion or\n * rejects with the handled error\n */\n forEach(next: (value: T) => void): Promise;\n\n /**\n * @param next a handler for each value emitted by the observable\n * @param promiseCtor a constructor function used to instantiate the Promise\n * @return a promise that either resolves on observable completion or\n * rejects with the handled error\n * @deprecated Passing a Promise constructor will no longer be available\n * in upcoming versions of RxJS. This is because it adds weight to the library, for very\n * little benefit. If you need this functionality, it is recommended that you either\n * polyfill Promise, or you create an adapter to convert the returned native promise\n * to whatever promise implementation you wanted. 
Will be removed in v8.\n */\n forEach(next: (value: T) => void, promiseCtor: PromiseConstructorLike): Promise;\n\n forEach(next: (value: T) => void, promiseCtor?: PromiseConstructorLike): Promise {\n promiseCtor = getPromiseCtor(promiseCtor);\n\n return new promiseCtor((resolve, reject) => {\n const subscriber = new SafeSubscriber({\n next: (value) => {\n try {\n next(value);\n } catch (err) {\n reject(err);\n subscriber.unsubscribe();\n }\n },\n error: reject,\n complete: resolve,\n });\n this.subscribe(subscriber);\n }) as Promise;\n }\n\n /** @internal */\n protected _subscribe(subscriber: Subscriber): TeardownLogic {\n return this.source?.subscribe(subscriber);\n }\n\n /**\n * An interop point defined by the es7-observable spec https://github.com/zenparsing/es-observable\n * @method Symbol.observable\n * @return {Observable} this instance of the observable\n */\n [Symbol_observable]() {\n return this;\n }\n\n /* tslint:disable:max-line-length */\n pipe(): Observable;\n pipe(op1: OperatorFunction): Observable;\n pipe(op1: OperatorFunction, op2: OperatorFunction): Observable;\n pipe(op1: OperatorFunction, op2: OperatorFunction, op3: OperatorFunction): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction,\n op6: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction,\n op6: OperatorFunction,\n op7: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction,\n op6: OperatorFunction,\n op7: 
OperatorFunction,\n op8: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction,\n op6: OperatorFunction,\n op7: OperatorFunction,\n op8: OperatorFunction,\n op9: OperatorFunction\n ): Observable;\n pipe(\n op1: OperatorFunction,\n op2: OperatorFunction,\n op3: OperatorFunction,\n op4: OperatorFunction,\n op5: OperatorFunction,\n op6: OperatorFunction,\n op7: OperatorFunction,\n op8: OperatorFunction,\n op9: OperatorFunction,\n ...operations: OperatorFunction[]\n ): Observable;\n /* tslint:enable:max-line-length */\n\n /**\n * Used to stitch together functional operators into a chain.\n * @method pipe\n * @return {Observable} the Observable result of all of the operators having\n * been called in the order they were passed in.\n *\n * ## Example\n *\n * ```ts\n * import { interval, filter, map, scan } from 'rxjs';\n *\n * interval(1000)\n * .pipe(\n * filter(x => x % 2 === 0),\n * map(x => x + x),\n * scan((acc, x) => acc + x)\n * )\n * .subscribe(x => console.log(x));\n * ```\n */\n pipe(...operations: OperatorFunction[]): Observable {\n return pipeFromArray(operations)(this);\n }\n\n /* tslint:disable:max-line-length */\n /** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise */\n toPromise(): Promise;\n /** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise */\n toPromise(PromiseCtor: typeof Promise): Promise;\n /** @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. 
Details: https://rxjs.dev/deprecations/to-promise */\n toPromise(PromiseCtor: PromiseConstructorLike): Promise<T | undefined>;\n /* tslint:enable:max-line-length */\n\n /**\n * Subscribe to this Observable and get a Promise resolving on\n * `complete` with the last emission (if any).\n *\n * **WARNING**: Only use this with observables you *know* will complete. If the source\n * observable does not complete, you will end up with a promise that is hung up, and\n * potentially all of the state of an async function hanging out in memory. To avoid\n * this situation, look into adding something like {@link timeout}, {@link take},\n * {@link takeWhile}, or {@link takeUntil} amongst others.\n *\n * @method toPromise\n * @param [promiseCtor] a constructor function used to instantiate\n * the Promise\n * @return A Promise that resolves with the last value emitted, or\n * rejects on an error. If there were no emissions, the Promise\n * resolves with undefined.\n * @deprecated Replaced with {@link firstValueFrom} and {@link lastValueFrom}. Will be removed in v8. Details: https://rxjs.dev/deprecations/to-promise\n */\n toPromise(promiseCtor?: PromiseConstructorLike): Promise<T | undefined> {\n promiseCtor = getPromiseCtor(promiseCtor);\n\n return new promiseCtor((resolve, reject) => {\n let value: T | undefined;\n this.subscribe(\n (x: T) => (value = x),\n (err: any) => reject(err),\n () => resolve(value)\n );\n }) as Promise<T | undefined>;\n }\n}\n\n/**\n * Decides between a passed promise constructor from consuming code,\n * a default configured promise constructor, and the native promise\n * constructor and returns it. If nothing can be found, it will throw\n * an error.\n * @param promiseCtor The optional promise constructor passed by consuming code\n */\nfunction getPromiseCtor(promiseCtor: PromiseConstructorLike | undefined) {\n return promiseCtor ?? config.Promise ?? 
Promise;\n}\n\nfunction isObserver(value: any): value is Observer {\n return value && isFunction(value.next) && isFunction(value.error) && isFunction(value.complete);\n}\n\nfunction isSubscriber(value: any): value is Subscriber {\n return (value && value instanceof Subscriber) || (isObserver(value) && isSubscription(value));\n}\n", "import { Observable } from '../Observable';\nimport { Subscriber } from '../Subscriber';\nimport { OperatorFunction } from '../types';\nimport { isFunction } from './isFunction';\n\n/**\n * Used to determine if an object is an Observable with a lift function.\n */\nexport function hasLift(source: any): source is { lift: InstanceType['lift'] } {\n return isFunction(source?.lift);\n}\n\n/**\n * Creates an `OperatorFunction`. Used to define operators throughout the library in a concise way.\n * @param init The logic to connect the liftedSource to the subscriber at the moment of subscription.\n */\nexport function operate(\n init: (liftedSource: Observable, subscriber: Subscriber) => (() => void) | void\n): OperatorFunction {\n return (source: Observable) => {\n if (hasLift(source)) {\n return source.lift(function (this: Subscriber, liftedSource: Observable) {\n try {\n return init(liftedSource, this);\n } catch (err) {\n this.error(err);\n }\n });\n }\n throw new TypeError('Unable to lift unknown Observable type');\n };\n}\n", "import { Subscriber } from '../Subscriber';\n\n/**\n * Creates an instance of an `OperatorSubscriber`.\n * @param destination The downstream subscriber.\n * @param onNext Handles next values, only called if this subscriber is not stopped or closed. Any\n * error that occurs in this function is caught and sent to the `error` method of this subscriber.\n * @param onError Handles errors from the subscription, any errors that occur in this handler are caught\n * and send to the `destination` error handler.\n * @param onComplete Handles completion notification from the subscription. 
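The shape `operate` produces — a function from one source to a wrapped source — can be sketched without the lift machinery. `Source` and `mapOperator` are hypothetical stand-ins (a callback-based source replaces a real Observable); the try/catch routing projection errors downstream mirrors the try/catch around `init` above.

```typescript
// A standalone sketch of the operator shape operate() returns: a function
// from one source to another. A callback-based "source" stands in for an
// Observable (an assumed simplification), and projection errors are routed
// to the error callback, mirroring the try/catch around init above.
type Source<T> = (next: (value: T) => void, error: (err: unknown) => void) => void;

function mapOperator<T, R>(project: (value: T) => R) {
  return (source: Source<T>): Source<R> =>
    (next, error) =>
      source((value) => {
        try {
          next(project(value));
        } catch (err) {
          error(err); // a throwing projection becomes a downstream error
        }
      }, error);
}

const numbers: Source<number> = (next) => [1, 2, 3].forEach(next);
const doubled = mapOperator((x: number) => x * 2)(numbers);
const received: number[] = [];
doubled((v) => received.push(v), () => {});
// received: [2, 4, 6]
```

Because an operator is just source-to-source, operators compose with ordinary function composition, which is what `pipe` exploits.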
Any errors that occur in\n * this handler are sent to the `destination` error handler.\n * @param onFinalize Additional teardown logic here. This will only be called on teardown if the\n * subscriber itself is not already closed. This is called after all other teardown logic is executed.\n */\nexport function createOperatorSubscriber(\n destination: Subscriber,\n onNext?: (value: T) => void,\n onComplete?: () => void,\n onError?: (err: any) => void,\n onFinalize?: () => void\n): Subscriber {\n return new OperatorSubscriber(destination, onNext, onComplete, onError, onFinalize);\n}\n\n/**\n * A generic helper for allowing operators to be created with a Subscriber and\n * use closures to capture necessary state from the operator function itself.\n */\nexport class OperatorSubscriber extends Subscriber {\n /**\n * Creates an instance of an `OperatorSubscriber`.\n * @param destination The downstream subscriber.\n * @param onNext Handles next values, only called if this subscriber is not stopped or closed. Any\n * error that occurs in this function is caught and sent to the `error` method of this subscriber.\n * @param onError Handles errors from the subscription, any errors that occur in this handler are caught\n * and send to the `destination` error handler.\n * @param onComplete Handles completion notification from the subscription. Any errors that occur in\n * this handler are sent to the `destination` error handler.\n * @param onFinalize Additional finalization logic here. This will only be called on finalization if the\n * subscriber itself is not already closed. 
This is called after all other finalization logic is executed.\n * @param shouldUnsubscribe An optional check to see if an unsubscribe call should truly unsubscribe.\n * NOTE: This currently **ONLY** exists to support the strange behavior of {@link groupBy}, where unsubscription\n * to the resulting observable does not actually disconnect from the source if there are active subscriptions\n * to any grouped observable. (DO NOT EXPOSE OR USE EXTERNALLY!!!)\n */\n constructor(\n destination: Subscriber,\n onNext?: (value: T) => void,\n onComplete?: () => void,\n onError?: (err: any) => void,\n private onFinalize?: () => void,\n private shouldUnsubscribe?: () => boolean\n ) {\n // It's important - for performance reasons - that all of this class's\n // members are initialized and that they are always initialized in the same\n // order. This will ensure that all OperatorSubscriber instances have the\n // same hidden class in V8. This, in turn, will help keep the number of\n // hidden classes involved in property accesses within the base class as\n // low as possible. If the number of hidden classes involved exceeds four,\n // the property accesses will become megamorphic and performance penalties\n // will be incurred - i.e. inline caches won't be used.\n //\n // The reasons for ensuring all instances have the same hidden class are\n // further discussed in this blog post from Benedikt Meurer:\n // https://benediktmeurer.de/2018/03/23/impact-of-polymorphism-on-component-based-frameworks-like-react/\n super(destination);\n this._next = onNext\n ? function (this: OperatorSubscriber, value: T) {\n try {\n onNext(value);\n } catch (err) {\n destination.error(err);\n }\n }\n : super._next;\n this._error = onError\n ? 
function (this: OperatorSubscriber, err: any) {\n try {\n onError(err);\n } catch (err) {\n // Send any errors that occur down stream.\n destination.error(err);\n } finally {\n // Ensure finalization.\n this.unsubscribe();\n }\n }\n : super._error;\n this._complete = onComplete\n ? function (this: OperatorSubscriber) {\n try {\n onComplete();\n } catch (err) {\n // Send any errors that occur down stream.\n destination.error(err);\n } finally {\n // Ensure finalization.\n this.unsubscribe();\n }\n }\n : super._complete;\n }\n\n unsubscribe() {\n if (!this.shouldUnsubscribe || this.shouldUnsubscribe()) {\n const { closed } = this;\n super.unsubscribe();\n // Execute additional teardown if we have any and we didn't already do so.\n !closed && this.onFinalize?.();\n }\n }\n}\n", "import { Subscription } from '../Subscription';\n\ninterface AnimationFrameProvider {\n schedule(callback: FrameRequestCallback): Subscription;\n requestAnimationFrame: typeof requestAnimationFrame;\n cancelAnimationFrame: typeof cancelAnimationFrame;\n delegate:\n | {\n requestAnimationFrame: typeof requestAnimationFrame;\n cancelAnimationFrame: typeof cancelAnimationFrame;\n }\n | undefined;\n}\n\nexport const animationFrameProvider: AnimationFrameProvider = {\n // When accessing the delegate, use the variable rather than `this` so that\n // the functions can be called without being bound to the provider.\n schedule(callback) {\n let request = requestAnimationFrame;\n let cancel: typeof cancelAnimationFrame | undefined = cancelAnimationFrame;\n const { delegate } = animationFrameProvider;\n if (delegate) {\n request = delegate.requestAnimationFrame;\n cancel = delegate.cancelAnimationFrame;\n }\n const handle = request((timestamp) => {\n // Clear the cancel function. 
The request has been fulfilled, so\n // attempting to cancel the request upon unsubscription would be\n // pointless.\n cancel = undefined;\n callback(timestamp);\n });\n return new Subscription(() => cancel?.(handle));\n },\n requestAnimationFrame(...args) {\n const { delegate } = animationFrameProvider;\n return (delegate?.requestAnimationFrame || requestAnimationFrame)(...args);\n },\n cancelAnimationFrame(...args) {\n const { delegate } = animationFrameProvider;\n return (delegate?.cancelAnimationFrame || cancelAnimationFrame)(...args);\n },\n delegate: undefined,\n};\n", "import { createErrorClass } from './createErrorClass';\n\nexport interface ObjectUnsubscribedError extends Error {}\n\nexport interface ObjectUnsubscribedErrorCtor {\n /**\n * @deprecated Internal implementation detail. Do not construct error instances.\n * Cannot be tagged as internal: https://github.com/ReactiveX/rxjs/issues/6269\n */\n new (): ObjectUnsubscribedError;\n}\n\n/**\n * An error thrown when an action is invalid because the object has been\n * unsubscribed.\n *\n * @see {@link Subject}\n * @see {@link BehaviorSubject}\n *\n * @class ObjectUnsubscribedError\n */\nexport const ObjectUnsubscribedError: ObjectUnsubscribedErrorCtor = createErrorClass(\n (_super) =>\n function ObjectUnsubscribedErrorImpl(this: any) {\n _super(this);\n this.name = 'ObjectUnsubscribedError';\n this.message = 'object unsubscribed';\n }\n);\n", "import { Operator } from './Operator';\nimport { Observable } from './Observable';\nimport { Subscriber } from './Subscriber';\nimport { Subscription, EMPTY_SUBSCRIPTION } from './Subscription';\nimport { Observer, SubscriptionLike, TeardownLogic } from './types';\nimport { ObjectUnsubscribedError } from './util/ObjectUnsubscribedError';\nimport { arrRemove } from './util/arrRemove';\nimport { errorContext } from './util/errorContext';\n\n/**\n * A Subject is a special type of Observable that allows values to be\n * multicasted to many Observers. 
Subjects are like EventEmitters.\n *\n * Every Subject is an Observable and an Observer. You can subscribe to a\n * Subject, and you can call next to feed values as well as error and complete.\n */\nexport class Subject extends Observable implements SubscriptionLike {\n closed = false;\n\n private currentObservers: Observer[] | null = null;\n\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n observers: Observer[] = [];\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n isStopped = false;\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n hasError = false;\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. */\n thrownError: any = null;\n\n /**\n * Creates a \"subject\" by basically gluing an observer to an observable.\n *\n * @nocollapse\n * @deprecated Recommended you do not use. Will be removed at some point in the future. Plans for replacement still under discussion.\n */\n static create: (...args: any[]) => any = (destination: Observer, source: Observable): AnonymousSubject => {\n return new AnonymousSubject(destination, source);\n };\n\n constructor() {\n // NOTE: This must be here to obscure Observable's constructor.\n super();\n }\n\n /** @deprecated Internal implementation detail, do not use directly. Will be made internal in v8. 
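The multicast behavior described above — every registered observer receives each `next` value — can be sketched standalone. `MiniSubject` and `MiniObserver` are hypothetical, assumed simplifications: the real `Subject` also handles error/complete, closed state, and the `currentObservers` snapshot shown in the source.

```typescript
// Standalone sketch of multicasting: each value passed to next() is
// delivered to every currently registered observer.
type MiniObserver<T> = { next: (value: T) => void };

class MiniSubject<T> {
  private observers: MiniObserver<T>[] = [];

  // Returns a teardown function, playing the role of a Subscription.
  subscribe(observer: MiniObserver<T>): () => void {
    this.observers.push(observer);
    return () => {
      this.observers = this.observers.filter((o) => o !== observer);
    };
  }

  next(value: T): void {
    // Copy the list so observers added/removed mid-delivery don't shift it.
    for (const observer of [...this.observers]) {
      observer.next(value);
    }
  }
}

const subject = new MiniSubject<number>();
const seenA: number[] = [];
const seenB: number[] = [];
subject.subscribe({ next: (v) => seenA.push(v) });
subject.next(1); // only A is subscribed
subject.subscribe({ next: (v) => seenB.push(v) });
subject.next(2); // both A and B receive 2
// seenA: [1, 2]; seenB: [2]
```

The key point the sketch shows: a plain Subject has no memory, so a late subscriber misses earlier values.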
- - - - - - - - - - \ No newline at end of file diff --git a/2.2.2/index.html b/2.2.2/index.html deleted file mode 100644 index 076255c..0000000 --- a/2.2.2/index.html +++ /dev/null @@ -1,727 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - Projet ROK4 - Librairies python - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- -
- - - - - - - - -
- - - - - - - -
- -
- - - - -
-
- - - - - - - - - - - -
-
-
- - - -
-
-
- - - -
-
- - - - -

ROK4 Python libraries

-

ROK4 Logo

-

These libraries make it easier to handle ROK4 project entities such as Tile Matrix Sets, pyramids and layers, as well as the associated storage backends.

-

Install the library

-

Required system packages:

-
    -
  • Debian: apt install python3-rados python3-gdal
  • -
-

From PyPI: pip install rok4

-

From GitHub: pip install https://github.com/rok4/core-python/releases/download/2.2.2/rok4-2.2.2-py3-none-any.whl

-

The runtime environment must have access to these system libraries. If you work inside a Python virtual environment, be sure to create it with python3 -m venv --system-site-packages .venv.

-

Use the library

-
from rok4.tile_matrix_set import TileMatrixSet
-
-try:
-    tms = TileMatrixSet("file:///path/to/tms.json")
-except Exception as exc:
-    print(exc)
-
-

The following environment variables may be required, per module:

-
    -
  • storage: more details in the module's technical documentation
      -
    • ROK4_READING_LRU_CACHE_SIZE : Number of elements in the read cache (0 for no limit)
    • -
    • ROK4_READING_LRU_CACHE_TTL : Validity duration of a cache element, in seconds (0 for no limit)
    • -
    • ROK4_CEPH_CONFFILE : Ceph cluster configuration file
    • -
    • ROK4_CEPH_USERNAME : Ceph cluster access account
    • -
    • ROK4_CEPH_CLUSTERNAME : Ceph cluster name
    • -
    • ROK4_S3_KEY : S3 server(s) key(s)
    • -
    • ROK4_S3_SECRETKEY : S3 server(s) secret key(s)
    • -
    • ROK4_S3_URL : S3 server(s) URL(s)
    • -
    • ROK4_SSL_NO_VERIFY : Disable SSL verification for S3 access (any non-empty value)
    • -
    -
  • -
  • tile_matrix_set :
      -
    • ROK4_TMS_DIRECTORY : Root directory (file or object) of the tile matrix sets
    • -
    -
  • -
  • style :
      -
    • ROK4_STYLES_DIRECTORY : Root directory (file or object) of the styles
    • -
    -
  • -
-

Readings use an LRU cache system with a TTL. It can be configured with environment variables:
  • ROK4_READING_LRU_CACHE_SIZE : Number of cached elements. Defaults to 64. Set 0 or a negative integer for an unbounded cache. A power of two makes the cache more efficient.
  • ROK4_READING_LRU_CACHE_TTL : Validity duration of a cached element, in seconds. Defaults to 300. Set 0 or a negative integer for a cache without expiration.

-

To disable the cache (always read data from storage), set ROK4_READING_LRU_CACHE_SIZE to 1 and ROK4_READING_LRU_CACHE_TTL to 1.

-
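These cache variables can be set from Python before the rok4 modules are used. A minimal sketch, assuming the variables are read from the environment when the storage module needs them (the values here are arbitrary examples):

```python
import os

# Arbitrary example values: a power-of-two cache size (see the note above)
# and a 10-minute TTL. Set them before the rok4 modules read them.
os.environ["ROK4_READING_LRU_CACHE_SIZE"] = "128"
os.environ["ROK4_READING_LRU_CACHE_TTL"] = "600"

print(os.environ["ROK4_READING_LRU_CACHE_SIZE"])
```

The same approach applies to the Ceph and S3 variables listed above.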

Using Ceph storage requires the ROK4_CEPH_* environment variables listed above.

-

Using S3 storage requires the ROK4_S3_* environment variables listed above.

-

More examples in the developer documentation.

-

Contributing

-
    -
  • -

    Install the development dependencies:

    -

    python3 -m pip install -e .[dev]
    pre-commit install

    -
  • -
  • -

    Read the contribution guidelines

    -
  • -
-

Build the library

-
apt install python3-venv python3-rados python3-gdal
-python3 -m venv .venv
-source .venv/bin/activate
-python3 -m pip install --upgrade build bump2version
-bump2version --current-version 0.0.0 --new-version 2.2.2 patch
-
-# Run unit tests
-python3 -m pip install -e .[test]
-# To use system installed modules rados and osgeo
-echo "/usr/lib/python3/dist-packages/" >.venv/lib/python3.10/site-packages/system.pth
-python3 -c 'import sys; print (sys.path)'
-# Run tests
-coverage run -m pytest
-# Get tests report and generate site
-coverage report -m
-coverage html -d dist/tests/
-
-# Build documentation
-python3 -m pip install -e .[doc]
-pdoc3 --html --output-dir dist/ rok4
-
-# Build artefacts
-python3 -m build
-
-

Note:

-

When installing the apt package python3-gdal, one of its dependencies may prompt for configuration input. To install in a non-interactive environment, set the shell variable DEBIAN_FRONTEND=noninteractive to apply a default configuration.

-

Publish the library on PyPI

-

Configure the $HOME/.pypirc file with your PyPI account credentials.

-
python3 -m pip install --upgrade twine
-python3 -m twine upload --repository pypi dist/rok4-2.2.2-py3-none-any.whl dist/rok4-2.2.2.tar.gz
-
- - - - - - - - - - - - - - - - - - -
-
- - - -
- -
- - - -
-
-
-
- - - - - - - - - - \ No newline at end of file diff --git a/2.2.2/rok4/enums.html b/2.2.2/rok4/enums.html deleted file mode 100644 index 4314102..0000000 --- a/2.2.2/rok4/enums.html +++ /dev/null @@ -1,249 +0,0 @@ - - - - - - -rok4.enums API documentation - - - - - - - - - - - -
-
-
-

Module rok4.enums

-
-
-
-
-
-
-
-
-
-
-

Classes

-
-
-class ColorFormat -(value, names=None, *, module=None, qualname=None, type=None, start=1) -
-
-

A color format enumeration. Except for "BIT", each member's name matches a common variable format name. The member's value is the bit size allocated to this format.

-
- -Expand source code - -
class ColorFormat(Enum):
-    """A color format enumeration.
-    Except from "BIT", the member's name matches
-      a common variable format name. The member's value is
-      the allocated bit size associated to this format.
-    """
-
-    BIT = 1
-    UINT8 = 8
-    FLOAT32 = 32
-
-

Ancestors

-
    -
  • enum.Enum
  • -
-

Class variables

-
-
var BIT
-
-
-
-
var FLOAT32
-
-
-
-
var UINT8
-
-
-
-
-
-
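As an illustration of how the member values can be used, here is a sketch computing the byte size of a pixel from a local replica of the enumeration; bytes_per_pixel is an illustrative helper, not a library function:

```python
from enum import Enum

# Replica of the documented ColorFormat enum: the member value is the
# bit size allocated to the format.
class ColorFormat(Enum):
    BIT = 1
    UINT8 = 8
    FLOAT32 = 32

def bytes_per_pixel(fmt: ColorFormat, channels: int) -> float:
    # e.g. a 3-channel FLOAT32 pixel needs 3 * 32 / 8 = 12 bytes
    return channels * fmt.value / 8

print(bytes_per_pixel(ColorFormat.FLOAT32, 3))
```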
-class PyramidType -(value, names=None, *, module=None, qualname=None, type=None, start=1) -
-
-

Pyramid's data type

-
- -Expand source code - -
class PyramidType(Enum):
-    """Pyramid's data type"""
-
-    RASTER = "RASTER"
-    VECTOR = "VECTOR"
-
-

Ancestors

-
    -
  • enum.Enum
  • -
-

Class variables

-
-
var RASTER
-
-
-
-
var VECTOR
-
-
-
-
-
-
-class SlabType -(value, names=None, *, module=None, qualname=None, type=None, start=1) -
-
-

Slab's type

-
- -Expand source code - -
class SlabType(Enum):
-    """Slab's type"""
-
-    DATA = "DATA"  # Slab of data, raster or vector
-    MASK = "MASK"  # Slab of mask, only for raster pyramid, image with one band : 0 is nodata, other values are data
-
-

Ancestors

-
    -
  • enum.Enum
  • -
-

Class variables

-
-
var DATA
-
-
-
-
var MASK
-
-
-
-
-
-
-class StorageType -(value, names=None, *, module=None, qualname=None, type=None, start=1) -
-
-

Storage type and path's protocol

-
- -Expand source code - -
class StorageType(Enum):
-    """Storage type and path's protocol"""
-
-    CEPH = "ceph://"
-    FILE = "file://"
-    HTTP = "http://"
-    HTTPS = "https://"
-    S3 = "s3://"
-
-

Ancestors

-
    -
  • enum.Enum
  • -
-

Class variables

-
-
var CEPH
-
-
-
-
var FILE
-
-
-
-
var HTTP
-
-
-
-
var HTTPS
-
-
-
-
var S3
-
-
-
-
-
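The member values double as path protocol prefixes. A sketch of prefix matching with a local replica of the enumeration; storage_type_of is an illustrative helper, not a library function:

```python
from enum import Enum

# Replica of the documented StorageType enum (value = path protocol prefix).
class StorageType(Enum):
    CEPH = "ceph://"
    FILE = "file://"
    HTTP = "http://"
    HTTPS = "https://"
    S3 = "s3://"

def storage_type_of(path: str) -> StorageType:
    # Match the path against each protocol prefix.
    for st in StorageType:
        if path.startswith(st.value):
            return st
    raise ValueError(f"Unknown storage protocol: {path}")

print(storage_type_of("s3://bucket/tms.json").name)
```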
-
-
-
- -
- - - diff --git a/2.2.2/rok4/exceptions.html b/2.2.2/rok4/exceptions.html deleted file mode 100644 index e41fa75..0000000 --- a/2.2.2/rok4/exceptions.html +++ /dev/null @@ -1,178 +0,0 @@ - - - - - - -rok4.exceptions API documentation - - - - - - - - - - - -
-
-
-

Module rok4.exceptions

-
-
-
-
-
-
-
-
-
-
-

Classes

-
-
-class FormatError -(expected_format, content, issue) -
-
-

Exception raised when a format is expected but not respected

-
- -Expand source code - -
class FormatError(Exception):
-    """
-    Exception raised when a format is expected but not respected
-    """
-
-    def __init__(self, expected_format, content, issue):
-        self.expected_format = expected_format
-        self.content = content
-        self.issue = issue
-        super().__init__(f"Expected format {expected_format} to read '{content}' : {issue}")
-
-

Ancestors

-
    -
  • builtins.Exception
  • -
  • builtins.BaseException
  • -
-
-
-class MissingAttributeError -(path, missing) -
-
-

Exception raised when an attribute is missing in a file

-
- -Expand source code - -
class MissingAttributeError(Exception):
-    """
-    Exception raised when an attribute is missing in a file
-    """
-
-    def __init__(self, path, missing):
-        self.path = path
-        self.missing = missing
-        super().__init__(f"Missing attribute {missing} in '{path}'")
-
-

Ancestors

-
    -
  • builtins.Exception
  • -
  • builtins.BaseException
  • -
-
-
-class MissingEnvironmentError -(missing) -
-
-

Exception raised when a needed environment variable is not defined

-
- -Expand source code - -
class MissingEnvironmentError(Exception):
-    """
-    Exception raised when a needed environment variable is not defined
-    """
-
-    def __init__(self, missing):
-        self.missing = missing
-        super().__init__(f"Missing environment variable {missing}")
-
-

Ancestors

-
    -
  • builtins.Exception
  • -
  • builtins.BaseException
  • -
-
-
-class StorageError -(type, issue) -
-
-

Exception raised when an issue occurred while using a storage

-
- -Expand source code - -
class StorageError(Exception):
-    """
-    Exception raised when an issue occured when using a storage
-    """
-
-    def __init__(self, type, issue):
-        self.type = type
-        self.issue = issue
-        super().__init__(f"Issue occured using a {type} storage : {issue}")
-
-

Ancestors

-
    -
  • builtins.Exception
  • -
  • builtins.BaseException
  • -
-
-
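All these exceptions follow the same pattern: the constructor stores its arguments and builds a readable message. A self-contained sketch with a replica of FormatError, taken from the source shown above:

```python
# Replica of the documented FormatError: the message is built from the
# expected format, the content path and the underlying issue.
class FormatError(Exception):
    def __init__(self, expected_format, content, issue):
        self.expected_format = expected_format
        self.content = content
        self.issue = issue
        super().__init__(f"Expected format {expected_format} to read '{content}' : {issue}")

try:
    raise FormatError("JSON", "file:///tms.json", "unexpected end of input")
except FormatError as exc:
    print(exc)
```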
-
-
- -
- - - diff --git a/2.2.2/rok4/index.html b/2.2.2/rok4/index.html deleted file mode 100644 index a98e7c0..0000000 --- a/2.2.2/rok4/index.html +++ /dev/null @@ -1,107 +0,0 @@ - - - - - - -rok4 API documentation - - - - - - - - - - - -
-
-
-

Package rok4

-
-
-
-
-

Sub-modules

-
-
rok4.enums
-
-
-
-
rok4.exceptions
-
-
-
-
rok4.layer
-
-

Provide classes to use a layer …

-
-
rok4.pyramid
-
-

Provide classes to use pyramid's data …

-
-
rok4.raster
-
-

Provide functions to read information on raster data …

-
-
rok4.storage
-
-

Provide functions to read or write data …

-
-
rok4.style
-
-

Provide classes to use a ROK4 style …

-
-
rok4.tile_matrix_set
-
-

Provide classes to use a tile matrix set …

-
-
rok4.utils
-
-

Provide functions to manipulate OGR / OSR entities

-
-
rok4.vector
-
-

Provide a class to read information on vector data from a file path or object path …

-
-
-
-
-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/layer.html b/2.2.2/rok4/layer.html deleted file mode 100644 index 872474a..0000000 --- a/2.2.2/rok4/layer.html +++ /dev/null @@ -1,576 +0,0 @@ - - - - - - -rok4.layer API documentation - - - - - - - - - - - -
-
-
-

Module rok4.layer

-
-
-

Provide classes to use a layer.

-

The module contains the following class:

-
    -
  • Layer - Descriptor to broadcast pyramids' data
  • -
-
-
-
-
-
-
-
-
-

Classes

-
-
-class Layer -
-
-

A data layer, raster or vector

-

Attributes

-
-
__name : str
-
layer's technical name
-
__pyramids : Dict[str, Union[Pyramid,str,str]]
-
used pyramids
-
__format : str
-
pyramids' shared format
-
__tms : TileMatrixSet
-
Used grid
-
__keywords : List[str]
-
Keywords
-
__levels : Dict[str, Level]
-
Used pyramids' levels
-
__best_level : Level
-
Used pyramids best level
-
__resampling : str
-
Interpolation to use for resampling
-
__bbox : Tuple[float, float, float, float]
-
data bounding box, TMS coordinates system
-
__geobbox : Tuple[float, float, float, float]
-
data bounding box, EPSG:4326
-
-
- -Expand source code - -
class Layer:
-    """A data layer, raster or vector
-
-    Attributes:
-        __name (str): layer's technical name
-        __pyramids (Dict[str, Union[rok4.pyramid.Pyramid,str,str]]): used pyramids
-        __format (str): pyramid's list path
-        __tms (rok4.tile_matrix_set.TileMatrixSet): Used grid
-        __keywords (List[str]): Keywords
-        __levels (Dict[str, rok4.pyramid.Level]): Used pyramids' levels
-        __best_level (rok4.pyramid.Level): Used pyramids best level
-        __resampling (str): Interpolation to use fot resampling
-        __bbox (Tuple[float, float, float, float]): data bounding box, TMS coordinates system
-        __geobbox (Tuple[float, float, float, float]): data bounding box, EPSG:4326
-    """
-
-    @classmethod
-    def from_descriptor(cls, descriptor: str) -> "Layer":
-        """Create a layer from its descriptor
-
-        Args:
-            descriptor (str): layer's descriptor path
-
-        Raises:
-            FormatError: Provided path is not a well formed JSON
-            MissingAttributeError: Attribute is missing in the content
-            StorageError: Storage read issue (layer descriptor)
-            MissingEnvironmentError: Missing object storage informations
-
-        Returns:
-            Layer: a Layer instance
-        """
-        try:
-            data = json.loads(get_data_str(descriptor))
-
-        except JSONDecodeError as e:
-            raise FormatError("JSON", descriptor, e)
-
-        layer = cls()
-
-        storage_type, path, root, base_name = get_infos_from_path(descriptor)
-        layer.__name = base_name[:-5]  # on supprime l'extension.json
-
-        try:
-            # Attributs communs
-            layer.__title = data["title"]
-            layer.__abstract = data["abstract"]
-            layer.__load_pyramids(data["pyramids"])
-
-            # Paramètres optionnels
-            if "keywords" in data:
-                for k in data["keywords"]:
-                    layer.__keywords.append(k)
-
-            if layer.type == PyramidType.RASTER:
-                if "resampling" in data:
-                    layer.__resampling = data["resampling"]
-
-                if "styles" in data:
-                    layer.__styles = data["styles"]
-                else:
-                    layer.__styles = ["normal"]
-
-            # Les bbox, native et géographique
-            if "bbox" in data:
-                layer.__geobbox = (
-                    data["bbox"]["south"],
-                    data["bbox"]["west"],
-                    data["bbox"]["north"],
-                    data["bbox"]["east"],
-                )
-                layer.__bbox = reproject_bbox(layer.__geobbox, "EPSG:4326", layer.__tms.srs, 5)
-                # On force l'emprise de la couche, on recalcule donc les tuiles limites correspondantes pour chaque niveau
-                for level in layer.__levels.values():
-                    level.set_limits_from_bbox(layer.__bbox)
-            else:
-                layer.__bbox = layer.__best_level.bbox
-                layer.__geobbox = reproject_bbox(layer.__bbox, layer.__tms.srs, "EPSG:4326", 5)
-
-        except KeyError as e:
-            raise MissingAttributeError(descriptor, e)
-
-        return layer
-
-    @classmethod
-    def from_parameters(cls, pyramids: List[Dict[str, str]], name: str, **kwargs) -> "Layer":
-        """Create a default layer from parameters
-
-        Args:
-            pyramids (List[Dict[str, str]]): pyramids to use and extrem levels, bottom and top
-            name (str): layer's technical name
-            **title (str): Layer's title (will be equal to name if not provided)
-            **abstract (str): Layer's abstract (will be equal to name if not provided)
-            **styles (List[str]): Styles identifier to authorized for the layer
-            **resampling (str): Interpolation to use for resampling
-
-        Raises:
-            Exception: name contains forbidden characters or used pyramids do not shared same parameters (format, tms...)
-
-        Returns:
-            Layer: a Layer instance
-        """
-
-        layer = cls()
-
-        # Informations obligatoires
-        if not re.match("^[A-Za-z0-9_-]*$", name):
-            raise Exception(
-                f"Layer's name have to contain only letters, number, hyphen and underscore, to be URL and storage compliant ({name})"
-            )
-
-        layer.__name = name
-        layer.__load_pyramids(pyramids)
-
-        # Les bbox, native et géographique
-        layer.__bbox = layer.__best_level.bbox
-        layer.__geobbox = reproject_bbox(layer.__bbox, layer.__tms.srs, "EPSG:4326", 5)
-
-        # Informations calculées
-        layer.__keywords.append(layer.type.name)
-        layer.__keywords.append(layer.__name)
-
-        # Informations optionnelles
-        if "title" in kwargs and kwargs["title"] is not None:
-            layer.__title = kwargs["title"]
-        else:
-            layer.__title = name
-
-        if "abstract" in kwargs and kwargs["abstract"] is not None:
-            layer.__abstract = kwargs["abstract"]
-        else:
-            layer.__abstract = name
-
-        if layer.type == PyramidType.RASTER:
-            if "styles" in kwargs and kwargs["styles"] is not None and len(kwargs["styles"]) > 0:
-                layer.__styles = kwargs["styles"]
-            else:
-                layer.__styles = ["normal"]
-
-            if "resampling" in kwargs and kwargs["resampling"] is not None:
-                layer.__resampling = kwargs["resampling"]
-
-        return layer
-
-    def __init__(self) -> None:
-        self.__format = None
-        self.__tms = None
-        self.__best_level = None
-        self.__levels = {}
-        self.__keywords = []
-        self.__pyramids = []
-
-    def __load_pyramids(self, pyramids: List[Dict[str, str]]) -> None:
-        """Load and check pyramids
-
-        Args:
-            pyramids (List[Dict[str, str]]): List of descriptors' paths and optionnaly top and bottom levels
-
-        Raises:
-            Exception: Pyramids' do not all own the same format
-            Exception: Pyramids' do not all own the same TMS
-            Exception: Pyramids' do not all own the same channels number
-            Exception: Overlapping in usage pyramids' levels
-        """
-
-        # Toutes les pyramides doivent avoir les même caractéristiques
-        channels = None
-        for p in pyramids:
-            pyramid = Pyramid.from_descriptor(p["path"])
-            bottom_level = p.get("bottom_level", None)
-            top_level = p.get("top_level", None)
-
-            if bottom_level is None:
-                bottom_level = pyramid.bottom_level.id
-
-            if top_level is None:
-                top_level = pyramid.top_level.id
-
-            if self.__format is not None and self.__format != pyramid.format:
-                raise Exception(
-                    f"Used pyramids have to own the same format : {self.__format} != {pyramid.format}"
-                )
-            else:
-                self.__format = pyramid.format
-
-            if self.__tms is not None and self.__tms.id != pyramid.tms.id:
-                raise Exception(
-                    f"Used pyramids have to use the same TMS : {self.__tms.id} != {pyramid.tms.id}"
-                )
-            else:
-                self.__tms = pyramid.tms
-
-            if self.type == PyramidType.RASTER:
-                if channels is not None and channels != pyramid.raster_specifications["channels"]:
-                    raise Exception(
-                        f"Used RASTER pyramids have to own the same number of channels : {channels} != {pyramid.raster_specifications['channels']}"
-                    )
-                else:
-                    channels = pyramid.raster_specifications["channels"]
-                self.__resampling = pyramid.raster_specifications["interpolation"]
-
-            levels = pyramid.get_levels(bottom_level, top_level)
-            for level in levels:
-                if level.id in self.__levels:
-                    raise Exception(f"Level {level.id} is present in two used pyramids")
-                self.__levels[level.id] = level
-
-            self.__pyramids.append(
-                {"pyramid": pyramid, "bottom_level": bottom_level, "top_level": top_level}
-            )
-
-        self.__best_level = sorted(self.__levels.values(), key=lambda level: level.resolution)[0]
-
-    def __str__(self) -> str:
-        return f"{self.type.name} layer '{self.__name}'"
-
-    @property
-    def serializable(self) -> Dict:
-        """Get the dict version of the layer object, descriptor compliant
-
-        Returns:
-            Dict: descriptor structured object description
-        """
-        serialization = {
-            "title": self.__title,
-            "abstract": self.__abstract,
-            "keywords": self.__keywords,
-            "wmts": {"authorized": True},
-            "tms": {"authorized": True},
-            "bbox": {
-                "south": self.__geobbox[0],
-                "west": self.__geobbox[1],
-                "north": self.__geobbox[2],
-                "east": self.__geobbox[3],
-            },
-            "pyramids": [],
-        }
-
-        for p in self.__pyramids:
-            serialization["pyramids"].append(
-                {
-                    "bottom_level": p["bottom_level"],
-                    "top_level": p["top_level"],
-                    "path": p["pyramid"].descriptor,
-                }
-            )
-
-        if self.type == PyramidType.RASTER:
-            serialization["wms"] = {
-                "authorized": True,
-                "crs": ["CRS:84", "IGNF:WGS84G", "EPSG:3857", "EPSG:4258", "EPSG:4326"],
-            }
-
-            if self.__tms.srs.upper() not in serialization["wms"]["crs"]:
-                serialization["wms"]["crs"].append(self.__tms.srs.upper())
-
-            serialization["styles"] = self.__styles
-            serialization["resampling"] = self.__resampling
-
-        return serialization
-
-    def write_descriptor(self, directory: str = None) -> None:
-        """Print layer's descriptor as JSON
-
-        Args:
-            directory (str, optional): Directory (file or object) where to print the layer's descriptor, called <layer's name>.json. Defaults to None, JSON is printed to standard output.
-        """
-        content = json.dumps(self.serializable)
-
-        if directory is None:
-            print(content)
-        else:
-            put_data_str(content, os.path.join(directory, f"{self.__name}.json"))
-
-    @property
-    def type(self) -> PyramidType:
-        if self.__format == "TIFF_PBF_MVT":
-            return PyramidType.VECTOR
-        else:
-            return PyramidType.RASTER
-
-    @property
-    def bbox(self) -> Tuple[float, float, float, float]:
-        return self.__bbox
-
-    @property
-    def geobbox(self) -> Tuple[float, float, float, float]:
-        return self.__geobbox
-
-

Static methods

-
-
-def from_descriptor(descriptor: str) ‑> Layer -
-
-

Create a layer from its descriptor

-

Args

-
-
descriptor : str
-
layer's descriptor path
-
-

Raises

-
-
FormatError
-
Provided path is not a well formed JSON
-
MissingAttributeError
-
Attribute is missing in the content
-
StorageError
-
Storage read issue (layer descriptor)
-
MissingEnvironmentError
-
Missing object storage information
-
-

Returns

-
-
Layer
-
a Layer instance
-
-
-
-def from_parameters(pyramids: List[Dict[str, str]], name: str, **kwargs) ‑> Layer -
-
-

Create a default layer from parameters

-

Args

-
-
pyramids : List[Dict[str, str]]
-
pyramids to use and their extreme levels, bottom and top
-
name : str
-
layer's technical name
-
**title : str
-
Layer's title (will be equal to name if not provided)
-
**abstract : str
-
Layer's abstract (will be equal to name if not provided)
-
**styles : List[str]
-
Styles identifier to authorized for the layer
-
**resampling : str
-
Interpolation to use for resampling
-
-

Raises

-
-
Exception
-
name contains forbidden characters or the used pyramids do not share the same parameters (format, tms…)
-
-

Returns

-
-
Layer
-
a Layer instance
-
-
-
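The forbidden-characters check mentioned above relies on a simple regular expression, visible in the source code: ^[A-Za-z0-9_-]*$. A sketch of that validation; is_valid_layer_name is an illustrative helper, not a library function:

```python
import re

# Layer names must contain only letters, digits, hyphen and underscore,
# to stay URL and storage compliant (regex from the source shown above).
def is_valid_layer_name(name: str) -> bool:
    return re.match("^[A-Za-z0-9_-]*$", name) is not None

print(is_valid_layer_name("ortho_2023"), is_valid_layer_name("bad name!"))
```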
-

Instance variables

-
-
prop bbox : Tuple[float, float, float, float]
-
-
-
- -Expand source code - -
@property
-def bbox(self) -> Tuple[float, float, float, float]:
-    return self.__bbox
-
-
-
prop geobbox : Tuple[float, float, float, float]
-
-
-
- -Expand source code - -
@property
-def geobbox(self) -> Tuple[float, float, float, float]:
-    return self.__geobbox
-
-
-
prop serializable : Dict[~KT, ~VT]
-
-

Get the dict version of the layer object, descriptor compliant

-

Returns

-
-
Dict
-
descriptor structured object description
-
-
- -Expand source code - -
@property
-def serializable(self) -> Dict:
-    """Get the dict version of the layer object, descriptor compliant
-
-    Returns:
-        Dict: descriptor structured object description
-    """
-    serialization = {
-        "title": self.__title,
-        "abstract": self.__abstract,
-        "keywords": self.__keywords,
-        "wmts": {"authorized": True},
-        "tms": {"authorized": True},
-        "bbox": {
-            "south": self.__geobbox[0],
-            "west": self.__geobbox[1],
-            "north": self.__geobbox[2],
-            "east": self.__geobbox[3],
-        },
-        "pyramids": [],
-    }
-
-    for p in self.__pyramids:
-        serialization["pyramids"].append(
-            {
-                "bottom_level": p["bottom_level"],
-                "top_level": p["top_level"],
-                "path": p["pyramid"].descriptor,
-            }
-        )
-
-    if self.type == PyramidType.RASTER:
-        serialization["wms"] = {
-            "authorized": True,
-            "crs": ["CRS:84", "IGNF:WGS84G", "EPSG:3857", "EPSG:4258", "EPSG:4326"],
-        }
-
-        if self.__tms.srs.upper() not in serialization["wms"]["crs"]:
-            serialization["wms"]["crs"].append(self.__tms.srs.upper())
-
-        serialization["styles"] = self.__styles
-        serialization["resampling"] = self.__resampling
-
-    return serialization
-
-
-
prop type : PyramidType
-
-
-
- -Expand source code - -
@property
-def type(self) -> PyramidType:
-    if self.__format == "TIFF_PBF_MVT":
-        return PyramidType.VECTOR
-    else:
-        return PyramidType.RASTER
-
-
-
-

Methods

-
-
-def write_descriptor(self, directory: str = None) ‑> None -
-
-

Print layer's descriptor as JSON

-

Args

-
-
directory : str, optional
-
Directory (file or object) where to write the layer's descriptor, called <layer's name>.json. Defaults to None, in which case the JSON is printed to standard output.
-
-
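The behaviour described above can be sketched as follows; the serializable dict here is a placeholder and the file write stands in for the library's put_data_str helper:

```python
import json
import os

# Placeholder layer description (the real method serializes the full layer).
serializable = {"title": "ortho", "abstract": "ortho", "pyramids": []}
name = "ortho"
directory = None  # no directory: the descriptor goes to standard output

content = json.dumps(serializable)
if directory is None:
    print(content)
else:
    # put_data_str(content, path) in the real library; a plain file write here
    path = os.path.join(directory, f"{name}.json")
    with open(path, "w") as f:
        f.write(content)
```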
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/pyramid.html b/2.2.2/rok4/pyramid.html deleted file mode 100644 index d98dfa1..0000000 --- a/2.2.2/rok4/pyramid.html +++ /dev/null @@ -1,2773 +0,0 @@ - - - - - - -rok4.pyramid API documentation - - - - - - - - - - - -
-
-
-

Module rok4.pyramid

-
-
-

Provide classes to use pyramid's data.

-

The module contains the following classes:

- -
-
-
-
-

Global variables

-
-
var ROK4_IMAGE_HEADER_SIZE
-
-

Slab's header size, 2048 bytes

-
-
-
-
-

Functions

-
-
-def b36_number_decode(number: str) ‑> int -
-
-

Convert base-36 number to base-10

-

Args

-
-
number : str
-
base-36 number
-
-

Returns

-
-
int
-
base-10 number
-
-
-
-def b36_number_encode(number: int) ‑> str -
-
-

Convert base-10 number to base-36

-

Used alphabet is '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'

-

Args

-
-
number : int
-
base-10 number
-
-

Returns

-
-
str
-
base-36 number
-
-
-
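The base-36 conversions described above can be sketched as follows, using the stated alphabet; these are standalone illustrations, not the library's own implementation:

```python
# Alphabet stated in the documentation above.
ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def b36_number_encode(number: int) -> str:
    # Convert base-10 to base-36 by repeated division.
    if number == 0:
        return "0"
    digits = ""
    while number > 0:
        number, r = divmod(number, 36)
        digits = ALPHABET[r] + digits
    return digits

def b36_number_decode(number: str) -> int:
    # int() natively parses base-36 strings.
    return int(number, 36)

print(b36_number_encode(3600), b36_number_decode("2S0"))
```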
-def b36_path_decode(path: str) ‑> Tuple[int, int] -
-
-

Get slab's column and row from a base-36 based path

-

Args

-
-
path : str
-
slab's path
-
-

Returns

-
-
Tuple[int, int]
-
slab's column and row
-
-
-
-def b36_path_encode(column: int, row: int, slashs: int) ‑> str -
-
-

Convert slab indices to base-36 based path, with .tif extension

-

Args

-
-
column : int
-
slab's column
-
row : int
-
slab's row
-
slashs : int
-
number of slashes (used to split the path)
-
-

Returns

-
-
str
-
base-36 based path
-
-
-
-
-
-

Classes

-
-
-class Level -
-
-

A pyramid's level, raster or vector

-

Attributes

-
-
__id : str
-
level's identifier; it has to exist in the pyramid's TMS
-
__tile_limits : Dict[str, int]
-
minimum and maximum tiles' columns and rows of pyramid's content
-
__slab_size : Tuple[int, int]
-
number of tiles in a slab, widthwise and heightwise
-
__tables : List[Dict]
-
for a VECTOR pyramid, description of vector content, tables and attributes
-
-
- -Expand source code - -
class Level:
-    """A pyramid's level, raster or vector
-
-    Attributes:
-        __id (str): level's identifier. have to exist in the pyramid's used TMS
-        __tile_limits (Dict[str, int]): minimum and maximum tiles' columns and rows of pyramid's content
-        __slab_size (Tuple[int, int]): number of tile in a slab, widthwise and heightwise
-        __tables (List[Dict]): for a VECTOR pyramid, description of vector content, tables and attributes
-    """
-
-    @classmethod
-    def from_descriptor(cls, data: Dict, pyramid: "Pyramid") -> "Level":
-        """Create a pyramid's level from the pyramid's descriptor levels element
-
-        Args:
-            data (Dict): level's information from the pyramid's descriptor
-            pyramid (Pyramid): pyramid containing the level to create
-
-        Raises:
-            Exception: different storage or masks presence between the level and the pyramid
-            MissingAttributeError: Attribute is missing in the content
-
-        Returns:
-            Pyramid: a Level instance
-        """
-        level = cls()
-
-        level.__pyramid = pyramid
-
-        # Attributs communs
-        try:
-            level.__id = data["id"]
-            level.__tile_limits = data["tile_limits"]
-            level.__slab_size = (
-                data["tiles_per_width"],
-                data["tiles_per_height"],
-            )
-
-            # Informations sur le stockage : on les valide et stocke dans la pyramide
-            if pyramid.storage_type.name != data["storage"]["type"]:
-                raise Exception(
-                    f"Pyramid {pyramid.descriptor} owns levels using different storage types ({ data['storage']['type'] }) than its one ({pyramid.storage_type.name})"
-                )
-
-            if pyramid.storage_type == StorageType.FILE:
-                pyramid.storage_depth = data["storage"]["path_depth"]
-
-            if "mask_directory" in data["storage"] or "mask_prefix" in data["storage"]:
-                if not pyramid.own_masks:
-                    raise Exception(
-                        f"Pyramid {pyramid.descriptor} does not define a mask format but level {level.__id} defines mask storage information"
-                    )
-            else:
-                if pyramid.own_masks:
-                    raise Exception(
-                        f"Pyramid {pyramid.descriptor} defines a mask format but level {level.__id} does not define mask storage information"
-                    )
-
-        except KeyError as e:
-            raise MissingAttributeError(pyramid.descriptor, f"levels[].{e}")
-
-        # Attributes for a vector level
-        if level.__pyramid.type == PyramidType.VECTOR:
-            try:
-                level.__tables = data["tables"]
-
-            except KeyError as e:
-                raise MissingAttributeError(pyramid.descriptor, f"levels[].{e}")
-
-        return level
-
-    @classmethod
-    def from_other(cls, other: "Level", pyramid: "Pyramid") -> "Level":
-        """Create a pyramid's level from another one
-
-        Args:
-            other (Level): level to clone
-            pyramid (Pyramid): new pyramid containing the new level
-
-        Raises:
-            Exception: different storage or masks presence between the level and the pyramid
-            MissingAttributeError: Attribute is missing in the content
-
-        Returns:
-            Level: a Level instance
-        """
-
-        level = cls()
-
-        # Common attributes
-        level.__id = other.__id
-        level.__pyramid = pyramid
-        level.__tile_limits = other.__tile_limits
-        level.__slab_size = other.__slab_size
-
-        # Attributes for a vector level
-        if level.__pyramid.type == PyramidType.VECTOR:
-            level.__tables = other.__tables
-
-        return level
-
-    def __str__(self) -> str:
-        return f"{self.__pyramid.type.name} pyramid's level '{self.__id}' ({self.__pyramid.storage_type.name} storage)"
-
-    @property
-    def serializable(self) -> Dict:
-        """Get the dict version of the pyramid object, pyramid's descriptor compliant
-
-        Returns:
-            Dict: pyramid's descriptor structured object description
-        """
-        serialization = {
-            "id": self.__id,
-            "tiles_per_width": self.__slab_size[0],
-            "tiles_per_height": self.__slab_size[1],
-            "tile_limits": self.__tile_limits,
-        }
-
-        if self.__pyramid.type == PyramidType.VECTOR:
-            serialization["tables"] = self.__tables
-
-        if self.__pyramid.storage_type == StorageType.FILE:
-            serialization["storage"] = {
-                "type": "FILE",
-                "image_directory": f"{self.__pyramid.name}/DATA/{self.__id}",
-                "path_depth": self.__pyramid.storage_depth,
-            }
-            if self.__pyramid.own_masks:
-                serialization["storage"][
-                    "mask_directory"
-                ] = f"{self.__pyramid.name}/MASK/{self.__id}"
-
-        elif self.__pyramid.storage_type == StorageType.CEPH:
-            serialization["storage"] = {
-                "type": "CEPH",
-                "image_prefix": f"{self.__pyramid.name}/DATA_{self.__id}",
-                "pool_name": self.__pyramid.storage_root,
-            }
-            if self.__pyramid.own_masks:
-                serialization["storage"]["mask_prefix"] = f"{self.__pyramid.name}/MASK_{self.__id}"
-
-        elif self.__pyramid.storage_type == StorageType.S3:
-            serialization["storage"] = {
-                "type": "S3",
-                "image_prefix": f"{self.__pyramid.name}/DATA_{self.__id}",
-                "bucket_name": self.__pyramid.storage_root,
-            }
-            if self.__pyramid.own_masks:
-                serialization["storage"]["mask_prefix"] = f"{self.__pyramid.name}/MASK_{self.__id}"
-
-        return serialization
-
-    @property
-    def id(self) -> str:
-        return self.__id
-
-    @property
-    def bbox(self) -> Tuple[float, float, float, float]:
-        """Return level extent, based on tile limits
-
-        Returns:
-            Tuple[float, float, float, float]: level terrain extent (xmin, ymin, xmax, ymax)
-        """
-
-        min_bbox = self.__pyramid.tms.get_level(self.__id).tile_to_bbox(
-            self.__tile_limits["min_col"], self.__tile_limits["max_row"]
-        )
-        max_bbox = self.__pyramid.tms.get_level(self.__id).tile_to_bbox(
-            self.__tile_limits["max_col"], self.__tile_limits["min_row"]
-        )
-
-        return (min_bbox[0], min_bbox[1], max_bbox[2], max_bbox[3])
-
-    @property
-    def resolution(self) -> str:
-        return self.__pyramid.tms.get_level(self.__id).resolution
-
-    @property
-    def tile_matrix(self) -> TileMatrix:
-        return self.__pyramid.tms.get_level(self.__id)
-
-    @property
-    def slab_width(self) -> int:
-        return self.__slab_size[0]
-
-    @property
-    def slab_height(self) -> int:
-        return self.__slab_size[1]
-
-    @property
-    def tile_limits(self) -> Dict[str, int]:
-        return self.__tile_limits
-
-    def is_in_limits(self, column: int, row: int) -> bool:
-        """Are the tile indices within the limits?
-
-        Args:
-            column (int): tile's column
-            row (int): tile's row
-
-        Returns:
-            bool: True if the tile limits contain the provided tile indices
-        """
-        return (
-            self.__tile_limits["min_row"] <= row
-            and self.__tile_limits["max_row"] >= row
-            and self.__tile_limits["min_col"] <= column
-            and self.__tile_limits["max_col"] >= column
-        )
-
-    def set_limits_from_bbox(self, bbox: Tuple[float, float, float, float]) -> None:
-        """Set tile limits, based on provided bounding box
-
-        Args:
-            bbox (Tuple[float, float, float, float]): terrain extent (xmin, ymin, xmax, ymax), in TMS coordinates system
-
-        """
-
-        col_min, row_min, col_max, row_max = self.__pyramid.tms.get_level(self.__id).bbox_to_tiles(
-            bbox
-        )
-        self.__tile_limits = {
-            "min_row": row_min,
-            "max_col": col_max,
-            "max_row": row_max,
-            "min_col": col_min,
-        }
-
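The containment test performed by `Level.is_in_limits` can be sketched as a standalone function (a minimal, hypothetical version working on a bare `tile_limits` dict, not the rok4 API itself):

```python
# Minimal sketch of the tile-limits containment check used by
# Level.is_in_limits (standalone, hypothetical version).
def is_in_limits(tile_limits: dict, column: int, row: int) -> bool:
    # A tile is inside the level if both its indices fall in the ranges.
    return (
        tile_limits["min_row"] <= row <= tile_limits["max_row"]
        and tile_limits["min_col"] <= column <= tile_limits["max_col"]
    )

limits = {"min_row": 10, "max_row": 20, "min_col": 5, "max_col": 15}
print(is_in_limits(limits, 7, 12))   # inside the limits -> True
print(is_in_limits(limits, 30, 12))  # column beyond max_col -> False
```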
-

class Pyramid:
-    """A data pyramid, raster or vector
-
-    Attributes:
-        __name (str): pyramid's name
-        __descriptor (str): pyramid's descriptor path
-        __list (str): pyramid's list path
-        __tms (rok4.tile_matrix_set.TileMatrixSet): Used grid
-        __levels (Dict[str, Level]): Pyramid's levels
-        __format (str): Data format
-        __storage (Dict[str, Union[rok4.enums.StorageType,str,int]]): Pyramid's storage information (type, root and depth if FILE storage)
-        __raster_specifications (Dict): If raster pyramid, raster specifications
-        __content (Dict): Loading status (loaded), slab count (count) and list content (cache).
-
-            Example (S3 storage):
-
-                {
-                    'cache': {
-                        (<SlabType.DATA: 'DATA'>, '18', 5424, 7526): {
-                            'link': False,
-                            'md5': None,
-                            'root': 'pyramids@localhost:9000/LIMADM',
-                            'slab': 'DATA_18_5424_7526'
-                        }
-                    },
-                    'count': 1,
-                    'loaded': True
-                }
-    """
-
-    @classmethod
-    def from_descriptor(cls, descriptor: str) -> "Pyramid":
-        """Create a pyramid from its descriptor
-
-        Args:
-            descriptor (str): pyramid's descriptor path
-
-        Raises:
-            FormatError: Provided path or the descriptor is not a well formed JSON
-            Exception: Level issue: the pyramid has no level, or a level ID is not defined in the used TMS
-            MissingAttributeError: Attribute is missing in the content
-            StorageError: Storage read issue (pyramid descriptor or TMS)
-            MissingEnvironmentError: Missing object storage information or TMS root directory
-
-        Examples:
-
-            S3 stored descriptor
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor")
-
-        Returns:
-            Pyramid: a Pyramid instance
-        """
-        try:
-            data = json.loads(get_data_str(descriptor))
-
-        except JSONDecodeError as e:
-            raise FormatError("JSON", descriptor, e)
-
-        pyramid = cls()
-
-        pyramid.__storage["type"], path, pyramid.__storage["root"], base_name = get_infos_from_path(
-            descriptor
-        )
-        pyramid.__name = base_name[:-5]  # remove the ".json" extension
-        pyramid.__descriptor = descriptor
-        pyramid.__list = get_path_from_infos(
-            pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.list"
-        )
-
-        try:
-            # Common attributes
-            pyramid.__tms = TileMatrixSet(data["tile_matrix_set"])
-            pyramid.__format = data["format"]
-
-            # Raster pyramid attributes
-            if pyramid.type == PyramidType.RASTER:
-                pyramid.__raster_specifications = data["raster_specifications"]
-
-                if "mask_format" in data:
-                    pyramid.__masks = True
-                else:
-                    pyramid.__masks = False
-
-            # Levels
-            for level in data["levels"]:
-                lev = Level.from_descriptor(level, pyramid)
-                pyramid.__levels[lev.id] = lev
-
-                if pyramid.__tms.get_level(lev.id) is None:
-                    raise Exception(
-                        f"Pyramid {descriptor} owns a level with the ID '{lev.id}', not defined in the TMS '{pyramid.tms.name}'"
-                    )
-
-        except KeyError as e:
-            raise MissingAttributeError(descriptor, e)
-
-        if len(pyramid.__levels.keys()) == 0:
-            raise Exception(f"Pyramid '{descriptor}' has no level")
-
-        return pyramid
-
-    @classmethod
-    def from_other(cls, other: "Pyramid", name: str, storage: Dict, **kwargs) -> "Pyramid":
-        """Create a pyramid from another one
-
-        Args:
-            other (Pyramid): pyramid to clone
-            name (str): new pyramid's name
-            storage (Dict[str, Union[str, int]]): new pyramid's storage information
-            **mask (bool): whether masks are present (RASTER pyramids only)
-
-        Raises:
-            FormatError: Provided path or the TMS is not a well formed JSON
-            Exception: Level issue : no one in the pyramid or the used TMS, or level ID not defined in the TMS
-            MissingAttributeError: Attribute is missing in the content
-
-        Returns:
-            Pyramid: a Pyramid instance
-        """
-        try:
-            # Convert the storage type to the enumeration
-            if type(storage["type"]) is str:
-                storage["type"] = StorageType[storage["type"]]
-
-            if storage["type"] == StorageType.FILE and name.find("/") != -1:
-                raise Exception(f"A FILE stored pyramid's name cannot contain '/' : '{name}'")
-
-            if storage["type"] == StorageType.FILE and "depth" not in storage:
-                storage["depth"] = 2
-
-            pyramid = cls()
-
-            # Common attributes
-            pyramid.__name = name
-            pyramid.__storage = storage
-            pyramid.__masks = other.__masks
-
-            pyramid.__descriptor = get_path_from_infos(
-                pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.json"
-            )
-            pyramid.__list = get_path_from_infos(
-                pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.list"
-            )
-            pyramid.__tms = other.__tms
-            pyramid.__format = other.__format
-
-            # Raster pyramid attributes
-            if pyramid.type == PyramidType.RASTER:
-                if "mask" in kwargs:
-                    pyramid.__masks = kwargs["mask"]
-                elif other.own_masks:
-                    pyramid.__masks = True
-                else:
-                    pyramid.__masks = False
-                pyramid.__raster_specifications = other.__raster_specifications
-
-            # Levels
-            for level in other.__levels.values():
-                lev = Level.from_other(level, pyramid)
-                pyramid.__levels[lev.id] = lev
-
-        except KeyError as e:
-            raise MissingAttributeError(pyramid.descriptor, e)
-
-        return pyramid
-
-    def __init__(self) -> None:
-        self.__storage = {}
-        self.__levels = {}
-        self.__masks = None
-
-        self.__content = {"loaded": False, "count": 0, "cache": {}}
-
-    def __str__(self) -> str:
-        return f"{self.type.name} pyramid '{self.__name}' ({self.__storage['type'].name} storage)"
-
-    @property
-    def serializable(self) -> Dict:
-        """Get the dict version of the pyramid object, descriptor compliant
-
-        Returns:
-            Dict: descriptor structured object description
-        """
-
-        serialization = {"tile_matrix_set": self.__tms.name, "format": self.__format}
-
-        serialization["levels"] = []
-        sorted_levels = sorted(
-            self.__levels.values(), key=lambda level: level.resolution, reverse=True
-        )
-
-        for level in sorted_levels:
-            serialization["levels"].append(level.serializable)
-
-        if self.type == PyramidType.RASTER:
-            serialization["raster_specifications"] = self.__raster_specifications
-
-        if self.__masks:
-            serialization["mask_format"] = "TIFF_ZIP_UINT8"
-
-        return serialization
-
-    @property
-    def list(self) -> str:
-        return self.__list
-
-    @property
-    def descriptor(self) -> str:
-        return self.__descriptor
-
-    @property
-    def name(self) -> str:
-        return self.__name
-
-    @property
-    def tms(self) -> TileMatrixSet:
-        return self.__tms
-
-    @property
-    def raster_specifications(self) -> Dict:
-        """Get raster specifications for a RASTER pyramid
-
-        Example:
-
-            RGB pyramid with red nodata
-
-                {
-                    "channels": 3,
-                    "nodata": "255,0,0",
-                    "photometric": "rgb",
-                    "interpolation": "bicubic"
-                }
-
-        Returns:
-            Dict: Raster specifications, None if VECTOR pyramid
-        """
-        return self.__raster_specifications
-
-    @property
-    def storage_type(self) -> StorageType:
-        """Get the storage type
-
-        Returns:
-            StorageType: FILE, S3 or CEPH
-        """
-        return self.__storage["type"]
-
-    @property
-    def storage_root(self) -> str:
-        """Get the pyramid's storage root.
-
-        If storage is S3, the used cluster is removed.
-
-        Returns:
-            str: Pyramid's storage root
-        """
-
-        return self.__storage["root"].split("@", 1)[
-            0
-        ]  # strip the optional S3 cluster host specification
-
-    @property
-    def storage_depth(self) -> int:
-        return self.__storage.get("depth", None)
-
-    @property
-    def storage_s3_cluster(self) -> str:
-        """Get the pyramid's storage S3 cluster (host name)
-
-        Returns:
-            str: the host if known, None if the default one has to be used or if storage is not S3
-        """
-        if self.__storage["type"] == StorageType.S3:
-            try:
-                return self.__storage["root"].split("@")[1]
-            except IndexError:
-                return None
-        else:
-            return None
-
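The `storage_root` / `storage_s3_cluster` pair splits an S3 root of the form `bucket@host` on the first `@`. A minimal standalone sketch of that convention (hypothetical helpers, not the rok4 API):

```python
# Sketch of the "bucket@host" split performed by Pyramid.storage_root
# and Pyramid.storage_s3_cluster (hypothetical standalone helpers).
def storage_root(root: str) -> str:
    # Keep only the bucket, dropping an optional cluster host.
    return root.split("@", 1)[0]

def storage_s3_cluster(root: str):
    # Return the cluster host if present, None otherwise.
    parts = root.split("@")
    return parts[1] if len(parts) > 1 else None

print(storage_root("pyramids@localhost:9000"))       # pyramids
print(storage_s3_cluster("pyramids@localhost:9000")) # localhost:9000
print(storage_s3_cluster("pyramids"))                # None
```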
-    @storage_depth.setter
-    def storage_depth(self, d: int) -> None:
-        """Set the tree depth for a FILE storage
-
-        Args:
-            d (int): file storage depth
-
-        Raises:
-            Exception: the depth is not equal to the already known depth
-        """
-        if "depth" in self.__storage and self.__storage["depth"] != d:
-            raise Exception(f"Pyramid {self.__descriptor} owns levels with different path depths")
-        self.__storage["depth"] = d
-
-    @property
-    def own_masks(self) -> bool:
-        return self.__masks
-
-    @property
-    def format(self) -> str:
-        return self.__format
-
-    @property
-    def channels(self) -> int:
-        return self.raster_specifications["channels"]
-
-    @property
-    def tile_extension(self) -> str:
-        if self.__format in [
-            "TIFF_RAW_UINT8",
-            "TIFF_LZW_UINT8",
-            "TIFF_ZIP_UINT8",
-            "TIFF_PKB_UINT8",
-            "TIFF_RAW_FLOAT32",
-            "TIFF_LZW_FLOAT32",
-            "TIFF_ZIP_FLOAT32",
-            "TIFF_PKB_FLOAT32",
-        ]:
-            return "tif"
-        elif self.__format in ["TIFF_JPG_UINT8", "TIFF_JPG90_UINT8"]:
-            return "jpg"
-        elif self.__format == "TIFF_PNG_UINT8":
-            return "png"
-        elif self.__format == "TIFF_PBF_MVT":
-            return "pbf"
-        else:
-            raise Exception(
-                f"Unknown pyramid's format ({self.__format}), cannot return the tile extension"
-            )
-
-    @property
-    def bottom_level(self) -> "Level":
-        """Get the best resolution level in the pyramid
-
-        Returns:
-            Level: the bottom level
-        """
-        return sorted(self.__levels.values(), key=lambda level: level.resolution)[0]
-
-    @property
-    def top_level(self) -> "Level":
-        """Get the lowest resolution level in the pyramid
-
-        Returns:
-            Level: the top level
-        """
-        return sorted(self.__levels.values(), key=lambda level: level.resolution)[-1]
-
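`bottom_level` and `top_level` both sort the levels by resolution: the smallest resolution is the most detailed (bottom) level, the largest is the top. A standalone sketch with hypothetical level ids and resolutions:

```python
# Sketch of bottom/top level selection by resolution sort,
# mirroring Pyramid.bottom_level / Pyramid.top_level.
# Level ids and resolutions are hypothetical.
levels = {"12": 38.2, "15": 4.8, "18": 0.6}

ordered = sorted(levels.items(), key=lambda kv: kv[1])
bottom = ordered[0][0]   # best (smallest) resolution
top = ordered[-1][0]     # lowest (largest) resolution
print(bottom, top)       # 18 12
```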
-    @property
-    def type(self) -> PyramidType:
-        """Get the pyramid's type (RASTER or VECTOR) from its format
-
-        Returns:
-            PyramidType: RASTER or VECTOR
-        """
-        if self.__format == "TIFF_PBF_MVT":
-            return PyramidType.VECTOR
-        else:
-            return PyramidType.RASTER
-
-    def load_list(self) -> int:
-        """Load list content and cache it
-
-        If the list is already loaded, nothing is done
-
-        Returns:
-            int: the number of slabs in the list
-        """
-        if self.__content["loaded"]:
-            return self.__content["count"]
-
-        for slab, infos in self.list_generator():
-            self.__content["cache"][slab] = infos
-            self.__content["count"] += 1
-
-        self.__content["loaded"] = True
-
-        return self.__content["count"]
-
-    def list_generator(
-        self, level_id: str = None
-    ) -> Iterator[Tuple[Tuple[SlabType, str, int, int], Dict]]:
-        """Get list content
-
-        The list is copied to a temporary file, the roots are read and information about each slab is yielded. If the list is already loaded, the cached content is yielded.
-
-        Args:
-            level_id (str): if provided, only this level's slabs are yielded
-
-        Examples:
-
-            S3 stored descriptor
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-
-                    for (slab_type, level, column, row), infos in pyramid.list_generator():
-                        print(infos)
-
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and read the list")
-
-        Yields:
-            Iterator[Tuple[Tuple[SlabType,str,int,int], Dict]]: Slab indices and storage informations
-
-            Value example:
-
-                (
-                    (<SlabType.DATA: 'DATA'>, '18', 5424, 7526),
-                    {
-                        'link': False,
-                        'md5': None,
-                        'root': 'pyramids@localhost:9000/LIMADM',
-                        'slab': 'DATA_18_5424_7526'
-                    }
-                )
-
-        Raises:
-            StorageError: Unhandled pyramid storage to copy list
-            MissingEnvironmentError: Missing object storage information
-        """
-        if self.__content["loaded"]:
-            for slab, infos in self.__content["cache"].items():
-                if level_id is not None:
-                    if slab[1] == level_id:
-                        yield slab, infos
-                else:
-                    yield slab, infos
-        else:
-            # Copy the list to a temporary file (the list may be a stored object)
-            list_obj = tempfile.NamedTemporaryFile(mode="r", delete=False)
-            list_file = list_obj.name
-            copy(self.__list, f"file://{list_file}")
-            list_obj.close()
-
-            roots = {}
-            s3_cluster = self.storage_s3_cluster
-
-            with open(list_file) as listin:
-                # Read the roots
-                for line in listin:
-                    line = line.rstrip()
-
-                    if line == "#":
-                        break
-
-                    root_id, root_path = line.split("=", 1)
-
-                    if s3_cluster is None:
-                        roots[root_id] = root_path
-                    else:
-                        # We have an S3 cluster name: add it to the bucket name in the roots
-                        root_bucket, root_path = root_path.split("/", 1)
-                        roots[root_id] = f"{root_bucket}@{s3_cluster}/{root_path}"
-
-                # Read the slabs
-                for line in listin:
-                    line = line.rstrip()
-
-                    parts = line.split(" ", 1)
-                    slab_path = parts[0]
-                    slab_md5 = None
-                    if len(parts) == 2:
-                        slab_md5 = parts[1]
-
-                    root_id, slab_path = slab_path.split("/", 1)
-
-                    slab_type, level, column, row = self.get_infos_from_slab_path(slab_path)
-                    infos = {
-                        "root": roots[root_id],
-                        "link": root_id != "0",
-                        "slab": slab_path,
-                        "md5": slab_md5,
-                    }
-
-                    if level_id is not None:
-                        if level == level_id:
-                            yield ((slab_type, level, column, row), infos)
-                    else:
-                        yield ((slab_type, level, column, row), infos)
-
-            remove(f"file://{list_file}")
-
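The list file read by `list_generator` has two sections: root declarations (`id=root`, terminated by a lone `#`), then one slab per line as `root_id/slab_path` with an optional md5. A minimal standalone sketch of that parsing, on hypothetical sample data:

```python
# Sketch of the list-file parsing done by Pyramid.list_generator:
# roots section terminated by "#", then "root_id/slab_path [md5]" lines.
# Sample content is hypothetical.
sample = """0=pyramids/LIMADM
#
0/DATA_18_5424_7526 d41d8cd98f00b204e9800998ecf8427e
0/MASK_18_5424_7526"""

lines = iter(sample.splitlines())

# Read the roots until the "#" separator.
roots = {}
for line in lines:
    line = line.rstrip()
    if line == "#":
        break
    root_id, root_path = line.split("=", 1)
    roots[root_id] = root_path

# Read the slabs: the md5 column is optional.
slabs = []
for line in lines:
    parts = line.rstrip().split(" ", 1)
    root_id, slab_path = parts[0].split("/", 1)
    md5 = parts[1] if len(parts) == 2 else None
    slabs.append({"root": roots[root_id], "slab": slab_path, "md5": md5})

print(slabs[0]["slab"])  # DATA_18_5424_7526
```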
-    def get_level(self, level_id: str) -> "Level":
-        """Get one level according to its identifier
-
-        Args:
-            level_id: Level identifier
-
-        Returns:
-            The corresponding pyramid's level, None if not present
-        """
-
-        return self.__levels.get(level_id, None)
-
-    def get_levels(self, bottom_id: str = None, top_id: str = None) -> List[Level]:
-        """Get sorted levels in the provided range from bottom to top
-
-        Args:
-            bottom_id (str, optional): specific bottom level id. Defaults to None.
-            top_id (str, optional): specific top level id. Defaults to None.
-
-        Raises:
-            Exception: Provided levels are not consistent (bottom > top or not in the pyramid)
-
-        Examples:
-
-            All levels
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-                    levels = pyramid.get_levels()
-
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and get levels")
-
-            From pyramid's bottom to provided top (level 5)
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-                    levels = pyramid.get_levels(None, "5")
-
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and get levels")
-
-        Returns:
-            List[Level]: asked sorted levels
-        """
-
-        sorted_levels = sorted(self.__levels.values(), key=lambda level: level.resolution)
-
-        levels = []
-
-        begin = False
-        if bottom_id is None:
-            # No bottom level provided: start from the very bottom
-            begin = True
-        else:
-            if self.get_level(bottom_id) is None:
-                raise Exception(
-                    f"Pyramid {self.name} does not contain the provided bottom level {bottom_id}"
-                )
-
-        if top_id is not None and self.get_level(top_id) is None:
-            raise Exception(f"Pyramid {self.name} does not contain the provided top level {top_id}")
-
-        end = False
-
-        for level in sorted_levels:
-            if not begin and level.id == bottom_id:
-                begin = True
-
-            if begin:
-                levels.append(level)
-                if top_id is not None and level.id == top_id:
-                    end = True
-                    break
-                else:
-                    continue
-
-        if top_id is None:
-            # No top level provided: going all the way up is expected
-            end = True
-
-        if not begin or not end:
-            raise Exception(
-                f"Provided levels ids are not consistent to extract levels from the pyramid {self.name}"
-            )
-
-        return levels
-
-    def write_descriptor(self) -> None:
-        """Write the pyramid's descriptor to the final location (in the pyramid's storage root)"""
-
-        content = json.dumps(self.serializable)
-        put_data_str(content, self.__descriptor)
-
-    def get_infos_from_slab_path(self, path: str) -> Tuple[SlabType, str, int, int]:
-        """Get the slab's indices from its storage path
-
-        Args:
-            path (str): Slab's storage path
-
-        Examples:
-
-            FILE stored pyramid
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("/path/to/descriptor.json")
-                    slab_type, level, column, row = pyramid.get_infos_from_slab_path("DATA/12/00/4A/F7.tif")
-                    # (SlabType.DATA, "12", 159, 367)
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and convert a slab path")
-
-            S3 stored pyramid
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/pyramid.json")
-                    slab_type, level, column, row = pyramid.get_infos_from_slab_path("s3://bucket_name/path/to/pyramid/MASK_15_9164_5846")
-                    # (SlabType.MASK, "15", 9164, 5846)
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and convert a slab path")
-
-        Returns:
-            Tuple[SlabType, str, int, int]: Slab's type (DATA or MASK), level identifier, slab's column and slab's row
-        """
-        if self.__storage["type"] == StorageType.FILE:
-            parts = path.split("/")
-
-            # The part of the path holding the slab's column and row is at the end, depending on the configured depth
-            # depth = 2 -> the last 3 parts are used for the conversion
-            column, row = b36_path_decode("/".join(parts[-(self.__storage["depth"] + 1) :]))
-            level = parts[-(self.__storage["depth"] + 2)]
-            raw_slab_type = parts[-(self.__storage["depth"] + 3)]
-
-            # For backward compatibility with the old naming scheme
-            if raw_slab_type == "IMAGE":
-                raw_slab_type = "DATA"
-
-            slab_type = SlabType[raw_slab_type]
-
-            return slab_type, level, column, row
-        else:
-            parts = re.split(r"[/_]", path)
-            column = parts[-2]
-            row = parts[-1]
-            level = parts[-3]
-            raw_slab_type = parts[-4]
-
-            # For backward compatibility with the old naming scheme
-            if raw_slab_type == "IMG":
-                raw_slab_type = "DATA"
-            elif raw_slab_type == "MSK":
-                raw_slab_type = "MASK"
-
-            slab_type = SlabType[raw_slab_type]
-
-            return slab_type, level, int(column), int(row)
-
-    def get_slab_path_from_infos(
-        self, slab_type: SlabType, level: str, column: int, row: int, full: bool = True
-    ) -> str:
-        """Get slab's storage path from the indices
-
-        Args:
-            slab_type (SlabType): DATA or MASK
-            level (str): Level identifier
-            column (int): Slab's column
-            row (int): Slab's row
-            full (bool, optional): Full path or just relative path from pyramid storage root. Defaults to True.
-
-        Returns:
-            str: Absolute or relative slab's storage path
-        """
-        if self.__storage["type"] == StorageType.FILE:
-            slab_path = os.path.join(
-                slab_type.value, level, b36_path_encode(column, row, self.__storage["depth"])
-            )
-        else:
-            slab_path = f"{slab_type.value}_{level}_{column}_{row}"
-
-        if full:
-            return get_path_from_infos(
-                self.__storage["type"], self.__storage["root"], self.__name, slab_path
-            )
-        else:
-            return slab_path
-
-    def get_tile_data_binary(self, level: str, column: int, row: int) -> str:
-        """Get a pyramid's tile as binary string
-
-        To get a tile, 3 steps:
-            * calculate slab path from tile index
-            * read slab index to get offsets and sizes of slab's tiles
-            * read the tile from the slab
-
-        Args:
-            level (str): Tile's level
-            column (int): Tile's column
-            row (int): Tile's row
-
-        Limitations:
-            Pyramids with one-tile slab are not handled
-
-        Examples:
-
-            FILE stored raster pyramid, to extract a tile containing a point and save it as independent image
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("/data/pyramids/SCAN1000.json")
-                    level, col, row, pcol, prow = pyramid.get_tile_indices(992904.46, 6733643.15, "9", srs = "IGNF:LAMB93")
-                    data = pyramid.get_tile_data_binary(level, col, row)
-
-                    if data is None:
-                        print("No data")
-                    else:
-                        tile_name = f"tile_{level}_{col}_{row}.{pyramid.tile_extension}"
-                        with open(tile_name, "wb") as image:
-                            image.write(data)
-                        print(f"Tile written in {tile_name}")
-
-                except Exception as e:
-                    print(f"Cannot save a pyramid's tile: {e}")
-
-        Raises:
-            Exception: Level not found in the pyramid
-            NotImplementedError: Pyramid owns one-tile slabs
-            MissingEnvironmentError: Missing object storage information
-            StorageError: Storage read issue
-
-        Returns:
-            str: data, as binary string, None if no data
-        """
-
-        level_object = self.get_level(level)
-
-        if level_object is None:
-            raise Exception(f"No level {level} in the pyramid")
-
-        if level_object.slab_width == 1 and level_object.slab_height == 1:
-            raise NotImplementedError("One-tile slab pyramid is not handled")
-
-        if not level_object.is_in_limits(column, row):
-            return None
-
-        # Slab's indices
-        slab_column = column // level_object.slab_width
-        slab_row = row // level_object.slab_height
-
-        # Tile's indices within the slab
-        relative_tile_column = column % level_object.slab_width
-        relative_tile_row = row % level_object.slab_height
-
-        # Tile's index in the slab's header
-        tile_index = relative_tile_row * level_object.slab_width + relative_tile_column
-
-        # Compute the path of the slab containing the wanted tile
-        slab_path = self.get_slab_path_from_infos(SlabType.DATA, level, slab_column, slab_row)
-
-        # Read the tiles' offsets and sizes within the slab
-        # A ROK4 slab has a fixed 2048-byte header,
-        # then the offsets are stored (4 bytes each),
-        # then the sizes (4 bytes each)
-        try:
-            binary_index = get_data_binary(
-                slab_path,
-                (
-                    ROK4_IMAGE_HEADER_SIZE,
-                    2 * 4 * level_object.slab_width * level_object.slab_height,
-                ),
-            )
-        except FileNotFoundError:
-            # A missing slab is simply treated as an absence of data
-            return None
-
-        offsets = numpy.frombuffer(
-            binary_index,
-            dtype=numpy.dtype("uint32"),
-            count=level_object.slab_width * level_object.slab_height,
-        )
-        sizes = numpy.frombuffer(
-            binary_index,
-            dtype=numpy.dtype("uint32"),
-            offset=4 * level_object.slab_width * level_object.slab_height,
-            count=level_object.slab_width * level_object.slab_height,
-        )
-
-        if sizes[tile_index] == 0:
-            return None
-
-        return get_data_binary(slab_path, (offsets[tile_index], sizes[tile_index]))
-
-    def get_tile_data_raster(self, level: str, column: int, row: int) -> numpy.ndarray:
-        """Get a raster pyramid's tile as 3-dimension numpy ndarray
-
-        First dimension is the row, second one is column, third one is band.
-
-        Args:
-            level (str): Tile's level
-            column (int): Tile's column
-            row (int): Tile's row
-
-        Limitations:
-            Packbits (pyramid formats TIFF_PKB_FLOAT32 and TIFF_PKB_UINT8) and LZW (pyramid formats TIFF_LZW_FLOAT32 and TIFF_LZW_UINT8) compressions are not handled.
-
-        Raises:
-            Exception: Cannot get raster data for a vector pyramid
-            Exception: Level not found in the pyramid
-            NotImplementedError: Pyramid owns one-tile slabs
-            NotImplementedError: Raster pyramid format not handled
-            MissingEnvironmentError: Missing object storage information
-            StorageError: Storage read issue
-            FormatError: Cannot decode tile
-
-        Examples:
-
-            FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json")
-                    level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326")
-                    data = pyramid.get_tile_data_raster(level, col, row)
-
-                    if data is None:
-                        print("No data")
-                    else:
-                        print(data[prow][pcol])
-
-                except Exception as e:
-                    print(f"Cannot get a pyramid's pixel value: {e}")
-
-        Returns:
-            numpy.ndarray: data, as a numpy array, None if no data
-        """
-
-        if self.type == PyramidType.VECTOR:
-            raise Exception("Cannot get tile as raster data: it's a vector pyramid")
-
-        binary_tile = self.get_tile_data_binary(level, column, row)
-
-        if binary_tile is None:
-            return None
-
-        level_object = self.get_level(level)
-
-        if self.__format == "TIFF_JPG_UINT8" or self.__format == "TIFF_JPG90_UINT8":
-            try:
-                img = Image.open(io.BytesIO(binary_tile))
-            except Exception as e:
-                raise FormatError("JPEG", "binary tile", e)
-
-            data = numpy.asarray(img)
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        elif self.__format == "TIFF_RAW_UINT8":
-            data = numpy.frombuffer(binary_tile, dtype=numpy.dtype("uint8"))
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        elif self.__format == "TIFF_PNG_UINT8":
-            try:
-                img = Image.open(io.BytesIO(binary_tile))
-            except Exception as e:
-                raise FormatError("PNG", "binary tile", e)
-
-            data = numpy.asarray(img)
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        elif self.__format == "TIFF_ZIP_UINT8":
-            try:
-                data = numpy.frombuffer(zlib.decompress(binary_tile), dtype=numpy.dtype("uint8"))
-            except Exception as e:
-                raise FormatError("ZIP", "binary tile", e)
-
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        elif self.__format == "TIFF_ZIP_FLOAT32":
-            try:
-                data = numpy.frombuffer(zlib.decompress(binary_tile), dtype=numpy.dtype("float32"))
-            except Exception as e:
-                raise FormatError("ZIP", "binary tile", e)
-
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        elif self.__format == "TIFF_RAW_FLOAT32":
-            data = numpy.frombuffer(binary_tile, dtype=numpy.dtype("float32"))
-            data.shape = (
-                level_object.tile_matrix.tile_size[0],
-                level_object.tile_matrix.tile_size[1],
-                self.__raster_specifications["channels"],
-            )
-
-        else:
-            raise NotImplementedError(f"Cannot get tile as raster data for format {self.__format}")
-
-        return data
-
-    def get_tile_data_vector(self, level: str, column: int, row: int) -> Dict:
-        """Get a vector pyramid's tile as GeoJSON dictionary
-
-        Args:
-            level (str): Tile's level
-            column (int): Tile's column
-            row (int): Tile's row
-
-        Raises:
-            Exception: Cannot get vector data for a raster pyramid
-            Exception: Level not found in the pyramid
-            NotImplementedError: Pyramid owns one-tile slabs
-            NotImplementedError: Vector pyramid format not handled
-            MissingEnvironmentError: Missing object storage information
-            StorageError: Storage read issue
-            FormatError: Cannot decode tile
-
-        Examples:
-
-            S3 stored vector pyramid, to print a tile as GeoJSON
-
-                from rok4.pyramid import Pyramid
-
-                import json
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://pyramids/vectors/BDTOPO.json")
-                    level, col, row, pcol, prow = pyramid.get_tile_indices(40.325, 3.123, srs = "EPSG:4326")
-                    data = pyramid.get_tile_data_vector(level, col, row)
-
-                    if data is None:
-                        print("No data")
-                    else:
-                        print(json.dumps(data))
-
-                except Exception as e:
-                    print(f"Cannot print a vector pyramid's tile as GeoJSON: {e}")
-
-        Returns:
-            Dict: data, as a GeoJSON dictionary. None if no data
-        """
-
-        if self.type == PyramidType.RASTER:
-            raise Exception("Cannot get tile as vector data: it's a raster pyramid")
-
-        binary_tile = self.get_tile_data_binary(level, column, row)
-
-        if binary_tile is None:
-            return None
-
-        self.get_level(level)
-
-        if self.__format == "TIFF_PBF_MVT":
-            try:
-                data = mapbox_vector_tile.decode(binary_tile)
-            except Exception as e:
-                raise FormatError("PBF (MVT)", "binary tile", e)
-        else:
-            raise NotImplementedError(f"Cannot get tile as vector data for format {self.__format}")
-
-        return data
-
-    def get_tile_indices(
-        self, x: float, y: float, level: str = None, **kwargs
-    ) -> Tuple[str, int, int, int, int]:
-        """Get pyramid's tile and pixel indices from point's coordinates
-
-        The coordinate system used has to be the pyramid's one. With EPSG:4326, x is the latitude and y the longitude.
-
-        Args:
-            x (float): point's x
-            y (float): point's y
-            level (str, optional): Pyramid's level to take into account, the bottom one if None. Defaults to None.
-            **srs (string): spatial reference system of provided coordinates, with authority and code (same as the pyramid's one if not provided)
-
-        Raises:
-            Exception: Cannot find level to calculate indices
-            RuntimeError: Provided SRS is invalid for OSR
-
-        Examples:
-
-            FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json")
-                    level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326")
-                    data = pyramid.get_tile_data_raster(level, col, row)
-
-                    if data is None:
-                        print("No data")
-                    else:
-                        print(data[prow][pcol])
-
-                except Exception as e:
-                    print(f"Cannot get a pyramid's pixel value: {e}")
-
-        Returns:
-            Tuple[str, int, int, int, int]: Level identifier, tile's column, tile's row, pixel's (in the tile) column, pixel's row
-        """
-
-        level_object = self.bottom_level
-        if level is not None:
-            level_object = self.get_level(level)
-
-        if level_object is None:
-            raise Exception("Cannot find the level to calculate indices")
-
-        if (
-            "srs" in kwargs
-            and kwargs["srs"] is not None
-            and kwargs["srs"].upper() != self.__tms.srs.upper()
-        ):
-            sr = srs_to_spatialreference(kwargs["srs"])
-            x, y = reproject_point((x, y), sr, self.__tms.sr)
-
-        return (level_object.id,) + level_object.tile_matrix.point_to_indices(x, y)
-
-    def delete_level(self, level_id: str) -> None:
-        """Delete the given level in the description of the pyramid
-
-        Args:
-            level_id: Level identifier
-
-        Raises:
-            Exception: Cannot find level
-        """
-
-        try:
-            del self.__levels[level_id]
-        except Exception:
-            raise Exception(f"The level {level_id} does not exist in the pyramid")
-
-    def add_level(
-        self,
-        level_id: str,
-        tiles_per_width: int,
-        tiles_per_height: int,
-        tile_limits: Dict[str, int],
-    ) -> None:
-        """Add a level in the description of the pyramid
-
-        Args:
-            level_id: Level identifier
-            tiles_per_width : Number of tiles in width by slab
-            tiles_per_height : Number of tiles in height by slab
-            tile_limits : Minimum and maximum tiles' columns and rows of pyramid's content
-        """
-
-        data = {
-            "id": level_id,
-            "tile_limits": tile_limits,
-            "tiles_per_width": tiles_per_width,
-            "tiles_per_height": tiles_per_height,
-            "storage": {"type": self.storage_type.name},
-        }
-        if self.own_masks:
-            data["storage"]["mask_prefix"] = True
-        if self.storage_type == StorageType.FILE:
-            data["storage"]["path_depth"] = self.storage_depth
-
-        lev = Level.from_descriptor(data, self)
-
-        if self.__tms.get_level(lev.id) is None:
-            raise Exception(
-                f"Pyramid {self.name} owns a level with the ID '{lev.id}', which is not defined in the TMS '{self.tms.name}'"
-            )
-        else:
-            self.__levels[lev.id] = lev
-
-    @property
-    def size(self) -> int:
-        """Get the size of the pyramid
-
-        Examples:
-
-                from rok4.pyramid import Pyramid
-
-                try:
-                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-                    size = pyramid.size
-
-                except Exception as e:
-                    print("Cannot load the pyramid from its descriptor and get its size")
-
-        Returns:
-            int: size of the pyramid
-        """
-
-        if not hasattr(self, "_Pyramid__size"):
-            self.__size = size_path(
-                get_path_from_infos(self.__storage["type"], self.__storage["root"], self.__name)
-            )
-
-        return self.__size
-
-

Static methods

-
-
-def from_descriptor(descriptor: str) ‑> Pyramid -
-
-

Create a pyramid from its descriptor

-

Args

-
-
descriptor : str
-
pyramid's descriptor path
-
-

Raises

-
-
FormatError
-
Provided path or the descriptor is not a well-formed JSON
-
Exception
-
Level issue : no one in the pyramid or the used TMS, or level ID not defined in the TMS
-
MissingAttributeError
-
Attribute is missing in the content
-
StorageError
-
Storage read issue (pyramid descriptor or TMS)
-
MissingEnvironmentError
-
Missing object storage information or TMS root directory
-
-

Examples

-

S3 stored descriptor

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor")
-
-

Returns

-
-
Pyramid
-
a Pyramid instance
-
-
-
-def from_other(other: Pyramid, name: str, storage: Dict[~KT, ~VT], **kwargs) ‑> Pyramid -
-
-

Create a pyramid from another one

-

Args

-
-
other : Pyramid
-
pyramid to clone
-
name : str
-
new pyramid's name
-
storage : Dict[str, Union[str, int]]
-
new pyramid's storage information
-
-

**mask (bool) : Whether masks are present (only for RASTER)

-

Raises

-
-
FormatError
-
Provided path or the TMS is not a well-formed JSON
-
Exception
-
Level issue : no one in the pyramid or the used TMS, or level ID not defined in the TMS
-
MissingAttributeError
-
Attribute is missing in the content
-
-

Returns

-
-
Pyramid
-
a Pyramid instance
-
-
-
-

Instance variables

-
-
prop bottom_levelLevel
-
-

Get the best resolution level in the pyramid

-

Returns

-
-
Level
-
the bottom level
-
-
@property
-def bottom_level(self) -> "Level":
-    """Get the best resolution level in the pyramid
-
-    Returns:
-        Level: the bottom level
-    """
-    return sorted(self.__levels.values(), key=lambda level: level.resolution)[0]
-
-
-
prop channels : str
-
-
-
@property
-def channels(self) -> str:
-    return self.raster_specifications["channels"]
-
-
-
prop descriptor : str
-
-
-
@property
-def descriptor(self) -> str:
-    return self.__descriptor
-
-
-
prop format : str
-
-
-
@property
-def format(self) -> str:
-    return self.__format
-
-
-
prop list : str
-
-
-
@property
-def list(self) -> str:
-    return self.__list
-
-
-
prop name : str
-
-
-
@property
-def name(self) -> str:
-    return self.__name
-
-
-
prop own_masks : bool
-
-
-
@property
-def own_masks(self) -> bool:
-    return self.__masks
-
-
-
prop raster_specifications : Dict[~KT, ~VT]
-
-

Get raster specifications for a RASTER pyramid

-

Example

-

RGB pyramid with red nodata

-
{
-    "channels": 3,
-    "nodata": "255,0,0",
-    "photometric": "rgb",
-    "interpolation": "bicubic"
-}
-
-

Returns

-
-
Dict
-
Raster specifications, None if VECTOR pyramid
-
-
@property
-def raster_specifications(self) -> Dict:
-    """Get raster specifications for a RASTER pyramid
-
-    Example:
-
-        RGB pyramid with red nodata
-
-            {
-                "channels": 3,
-                "nodata": "255,0,0",
-                "photometric": "rgb",
-                "interpolation": "bicubic"
-            }
-
-    Returns:
-        Dict: Raster specifications, None if VECTOR pyramid
-    """
-    return self.__raster_specifications
-
-
-
prop serializable : Dict[~KT, ~VT]
-
-

Get the dict version of the pyramid object, descriptor compliant

-

Returns

-
-
Dict
-
descriptor structured object description
-
-
@property
-def serializable(self) -> Dict:
-    """Get the dict version of the pyramid object, descriptor compliant
-
-    Returns:
-        Dict: descriptor structured object description
-    """
-
-    serialization = {"tile_matrix_set": self.__tms.name, "format": self.__format}
-
-    serialization["levels"] = []
-    sorted_levels = sorted(
-        self.__levels.values(), key=lambda level: level.resolution, reverse=True
-    )
-
-    for level in sorted_levels:
-        serialization["levels"].append(level.serializable)
-
-    if self.type == PyramidType.RASTER:
-        serialization["raster_specifications"] = self.__raster_specifications
-
-    if self.__masks:
-        serialization["mask_format"] = "TIFF_ZIP_UINT8"
-
-    return serialization
-
-
-
prop storage_depth : int
-
-
-
@property
-def storage_depth(self) -> int:
-    return self.__storage.get("depth", None)
-
-
-
prop storage_root : str
-
-

Get the pyramid's storage root.

-

If storage is S3, the used cluster is removed.

-

Returns

-
-
str
-
Pyramid's storage root
-
-
@property
-def storage_root(self) -> str:
-    """Get the pyramid's storage root.
-
-    If storage is S3, the used cluster is removed.
-
-    Returns:
-        str: Pyramid's storage root
-    """
-
-    return self.__storage["root"].split("@", 1)[
-        0
-    ]  # Strip the optional S3 cluster host specification
-
-
-
prop storage_s3_cluster : str
-
-

Get the pyramid's storage S3 cluster (host name)

-

Returns

-
-
str
-
the host if known, None if the default one has to be used or if storage is not S3
-
-
@property
-def storage_s3_cluster(self) -> str:
-    """Get the pyramid's storage S3 cluster (host name)
-
-    Returns:
-        str: the host if known, None if the default one has to be used or if storage is not S3
-    """
-    if self.__storage["type"] == StorageType.S3:
-        try:
-            return self.__storage["root"].split("@")[1]
-        except IndexError:
-            return None
-    else:
-        return None
-
-
-
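The two properties above both split the storage root on "@" to separate the base root from an optional S3 cluster host. A standalone sketch of that splitting (the bucket and host names below are made up):

```python
def split_storage_root(root: str):
    """Split a storage root on "@": base root, then the S3 cluster host.

    Hypothetical helper mirroring storage_root and storage_s3_cluster
    above; the host is None when no cluster is specified.
    """
    base = root.split("@", 1)[0]
    try:
        host = root.split("@")[1]
    except IndexError:
        host = None
    return base, host

print(split_storage_root("my_bucket@s3-cluster.internal"))  # ('my_bucket', 's3-cluster.internal')
print(split_storage_root("my_bucket"))  # ('my_bucket', None)
```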
prop storage_typeStorageType
-
-

Get the storage type

-

Returns

-
-
StorageType
-
FILE, S3 or CEPH
-
-
@property
-def storage_type(self) -> StorageType:
-    """Get the storage type
-
-    Returns:
-        StorageType: FILE, S3 or CEPH
-    """
-    return self.__storage["type"]
-
-
-
prop tile_extension : str
-
-
-
@property
-def tile_extension(self) -> str:
-    if self.__format in [
-        "TIFF_RAW_UINT8",
-        "TIFF_LZW_UINT8",
-        "TIFF_ZIP_UINT8",
-        "TIFF_PKB_UINT8",
-        "TIFF_RAW_FLOAT32",
-        "TIFF_LZW_FLOAT32",
-        "TIFF_ZIP_FLOAT32",
-        "TIFF_PKB_FLOAT32",
-    ]:
-        return "tif"
-    elif self.__format in ["TIFF_JPG_UINT8", "TIFF_JPG90_UINT8"]:
-        return "jpg"
-    elif self.__format == "TIFF_PNG_UINT8":
-        return "png"
-    elif self.__format == "TIFF_PBF_MVT":
-        return "pbf"
-    else:
-        raise Exception(
-            f"Unknown pyramid's format ({self.__format}), cannot return the tile extension"
-        )
-
-
-
prop tmsTileMatrixSet
-
-
-
@property
-def tms(self) -> TileMatrixSet:
-    return self.__tms
-
-
-
prop top_levelLevel
-
-

Get the lowest resolution level in the pyramid

-

Returns

-
-
Level
-
the top level
-
-
@property
-def top_level(self) -> "Level":
-    """Get the lowest resolution level in the pyramid
-
-    Returns:
-        Level: the top level
-    """
-    return sorted(self.__levels.values(), key=lambda level: level.resolution)[-1]
-
-
-
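The bottom_level and top_level properties both rely on sorting levels by resolution: the first element is the bottom (best resolution), the last is the top. A standalone sketch, with made-up level ids and resolutions:

```python
# Hypothetical levels: id -> ground resolution (smaller = more detailed)
levels = [
    {"id": "5", "resolution": 128.0},
    {"id": "9", "resolution": 8.0},
    {"id": "7", "resolution": 32.0},
]

# Same selection as the properties above: sort by resolution ascending
ordered = sorted(levels, key=lambda level: level["resolution"])
print(ordered[0]["id"], ordered[-1]["id"])  # bottom "9", top "5"
```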
prop typePyramidType
-
-

Get the pyramid's type (RASTER or VECTOR) from its format

-

Returns

-
-
PyramidType
-
RASTER or VECTOR
-
-
@property
-def type(self) -> PyramidType:
-    """Get the pyramid's type (RASTER or VECTOR) from its format
-
-    Returns:
-        PyramidType: RASTER or VECTOR
-    """
-    if self.__format == "TIFF_PBF_MVT":
-        return PyramidType.VECTOR
-    else:
-        return PyramidType.RASTER
-
-
-
-

Methods

-
-
-def add_level(self, level_id: str, tiles_per_width: int, tiles_per_height: int, tile_limits: Dict[str, int]) ‑> None -
-
-

Add a level in the description of the pyramid

-

Args

-
-
level_id
-
Level identifier
-
-

tiles_per_width : Number of tiles in width by slab
tiles_per_height : Number of tiles in height by slab
tile_limits : Minimum and maximum tiles' columns and rows of pyramid's content

-
-
-def delete_level(self, level_id: str) ‑> None -
-
-

Delete the given level in the description of the pyramid

-

Args

-
-
level_id
-
Level identifier
-
-

Raises

-
-
Exception
-
Cannot find level
-
-
-
-def get_infos_from_slab_path(self, path: str) ‑> Tuple[SlabType, str, int, int] -
-
-

Get the slab's indices from its storage path

-

Args

-
-
path : str
-
Slab's storage path
-
-

Examples

-

FILE stored pyramid

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("/path/to/descriptor.json")
-    slab_type, level, column, row = pyramid.get_infos_from_slab_path("DATA/12/00/4A/F7.tif")
-    # (SlabType.DATA, "12", 159, 367)
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor and convert a slab path")
-
-

S3 stored pyramid

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/pyramid.json")
-    slab_type, level, column, row = pyramid.get_infos_from_slab_path("s3://bucket_name/path/to/pyramid/MASK_15_9164_5846")
-    # (SlabType.MASK, "15", 9164, 5846)
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor and convert a slab path")
-
-

Returns

-
-
Tuple[SlabType, str, int, int]
-
Slab's type (DATA or MASK), level identifier, slab's column and slab's row
-
-
-
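For the FILE case above, the base-36 path decoding can be sketched in isolation. This is a hypothetical re-implementation of b36_path_decode, consistent with the documented example (each path part carries one base-36 digit of the column and one of the row, most significant digits first):

```python
def b36_path_decode(path: str):
    """Decode a slab's column and row from its base-36 path parts.

    Assumed layout: each two-character part holds one base-36 digit of
    the column and one of the row, ordered most significant first.
    """
    digits = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    column, row = 0, 0
    for part in path.split("/"):
        part = part.split(".")[0]  # drop the ".tif" extension on the last part
        column = column * 36 + digits.index(part[0])
        row = row * 36 + digits.index(part[1])
    return column, row

# Documented example: "DATA/12/00/4A/F7.tif" -> column 159, row 367
print(b36_path_decode("00/4A/F7.tif"))  # (159, 367)
```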
-def get_level(self, level_id: str) ‑> Level -
-
-

Get one level according to its identifier

-

Args

-
-
level_id
-
Level identifier
-
-

Returns

-

The corresponding pyramid's level, None if not present

-
-
-def get_levels(self, bottom_id: str = None, top_id: str = None) ‑> List[Level] -
-
-

Get sorted levels in the provided range from bottom to top

-

Args

-
-
bottom_id : str, optional
-
specific bottom level id. Defaults to None.
-
top_id : str, optional
-
specific top level id. Defaults to None.
-
-

Raises

-
-
Exception
-
Provided levels are not consistent (bottom > top or not in the pyramid)
-
-

Examples

-

All levels

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-    levels = pyramid.get_levels()
-
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor and get levels")
-
-

From pyramid's bottom to provided top (level 5)

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-    levels = pyramid.get_levels(None, "5")
-
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor and get levels")
-
-

Returns

-
-
List[Level]
-
requested levels, sorted from bottom to top
-
-
-
-def get_slab_path_from_infos(self, slab_type: SlabType, level: str, column: int, row: int, full: bool = True) ‑> str -
-
-

Get slab's storage path from the indices

-

Args

-
-
slab_type : SlabType
-
DATA or MASK
-
level : str
-
Level identifier
-
column : int
-
Slab's column
-
row : int
-
Slab's row
-
full : bool, optional
-
Full path or just relative path from pyramid storage root. Defaults to True.
-
-

Returns

-
-
str
-
Absolute or relative slab's storage path
-
-
-
-def get_tile_data_binary(self, level: str, column: int, row: int) ‑> str -
-
-

Get a pyramid's tile as binary string

-

To get a tile, 3 steps : -* calculate the slab path from the tile index -* read the slab index to get offsets and sizes of the slab's tiles -* read the tile from the slab

-

Args

-
-
level : str
-
Tile's level
-
column : int
-
Tile's column
-
row : int
-
Tile's row
-
-

Limitations

-

Pyramids with one-tile slabs are not handled

-

Examples

-

FILE stored raster pyramid, to extract a tile containing a point and save it as an independent image

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("/data/pyramids/SCAN1000.json")
-    level, col, row, pcol, prow = pyramid.get_tile_indices(992904.46, 6733643.15, "9", srs = "IGNF:LAMB93")
-    data = pyramid.get_tile_data_binary(level, col, row)
-
-    if data is None:
-        print("No data")
-    else:
-        tile_name = f"tile_{level}_{col}_{row}.{pyramid.tile_extension}"
-        with open(tile_name, "wb") as image:
-            image.write(data)
-        print (f"Tile written in {tile_name}")
-
-except Exception as e:
-    print(f"Cannot save a pyramid's tile : {e}")
-
-

Raises

-
-
Exception
-
Level not found in the pyramid
-
NotImplementedError
-
Pyramid owns one-tile slabs
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
-

Returns

-
-
str
-
data, as binary string, None if no data
-
-
-
-def get_tile_data_raster(self, level: str, column: int, row: int) ‑> numpy.ndarray -
-
-

Get a raster pyramid's tile as 3-dimension numpy ndarray

-

First dimension is the row, second one is column, third one is band.

-

Args

-
-
level : str
-
Tile's level
-
column : int
-
Tile's column
-
row : int
-
Tile's row
-
-

Limitations

-

Packbits (pyramid formats TIFF_PKB_FLOAT32 and TIFF_PKB_UINT8) and LZW (pyramid formats TIFF_LZW_FLOAT32 and TIFF_LZW_UINT8) compressions are not handled.

-

Raises

-
-
Exception
-
Cannot get raster data for a vector pyramid
-
Exception
-
Level not found in the pyramid
-
NotImplementedError
-
Pyramid owns one-tile slabs
-
NotImplementedError
-
Raster pyramid format not handled
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
FormatError
-
Cannot decode tile
-
-

Examples

-

FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json")
-    level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326")
-    data = pyramid.get_tile_data_raster(level, col, row)
-
-    if data is None:
-        print("No data")
-    else:
-        print(data[prow][pcol])
-
-except Exception as e:
-    print(f"Cannot get a pyramid's pixel value : {e}")
-
-

Returns

-
-
numpy.ndarray
-
data, as numpy array, None if no data
-
-
-
-def get_tile_data_vector(self, level: str, column: int, row: int) ‑> Dict[~KT, ~VT] -
-
-

Get a vector pyramid's tile as GeoJSON dictionary

-

Args

-
-
level : str
-
Tile's level
-
column : int
-
Tile's column
-
row : int
-
Tile's row
-
-

Raises

-
-
Exception
-
Cannot get vector data for a raster pyramid
-
Exception
-
Level not found in the pyramid
-
NotImplementedError
-
Pyramid owns one-tile slabs
-
NotImplementedError
-
Vector pyramid format not handled
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
FormatError
-
Cannot decode tile
-
-

Examples

-

S3 stored vector pyramid, to print a tile as GeoJSON

-
from rok4.pyramid import Pyramid
-
-import json
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://pyramids/vectors/BDTOPO.json")
-    level, col, row, pcol, prow = pyramid.get_tile_indices(40.325, 3.123, srs = "EPSG:4326")
-    data = pyramid.get_tile_data_vector(level, col, row)
-
-    if data is None:
-        print("No data")
-    else:
-        print(json.dumps(data))
-
-except Exception as e:
-    print(f"Cannot print a vector pyramid's tile as GeoJSON : {e}")
-
-

Returns

-
-
Dict
-
data, as GeoJSON dictionary. None if no data
-
-
-
-def get_tile_indices(self, x: float, y: float, level: str = None, **kwargs) ‑> Tuple[str, int, int, int, int] -
-
-

Get pyramid's tile and pixel indices from point's coordinates

-

The coordinate system used has to be the pyramid's one. If EPSG:4326, x is the latitude and y the longitude.

-

Args

-
-
x : float
-
point's x
-
y : float
-
point's y
-
level : str, optional
-
Pyramid's level to take into account, the bottom one if None. Defaults to None.
-
**srs : string
-
spatial reference system of provided coordinates, with authority and code (same as the pyramid's one if not provided)
-
-

Raises

-
-
Exception
-
Cannot find level to calculate indices
-
RuntimeError
-
Provided SRS is invalid for OSR
-
-

Examples

-

FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json")
-    level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326")
-    data = pyramid.get_tile_data_raster(level, col, row)
-
-    if data is None:
-        print("No data")
-    else:
-        print(data[prow][pcol])
-
-except Exception as e:
-    print(f"Cannot get a pyramid's pixel value : {e}")
-
-

Returns

-
-
Tuple[str, int, int, int, int]
-
Level identifier, tile's column, tile's row, pixel's (in the tile) column, pixel's row
-
-
-
-def list_generator(self, level_id: str = None) ‑> Iterator[Tuple[Tuple[SlabType, str, int, int], Dict[~KT, ~VT]]] -
-
-

Get list content

-

The list is copied as a temporary file, roots are read and information about each slab is returned. If the list is already loaded, the cached content is yielded -Args : -level_id (str) : identifier of the level, to load only this level

-

Examples

-

S3 stored descriptor

-
from rok4.pyramid import Pyramid
-
-try:
-    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
-
-    for (slab_type, level, column, row), infos in pyramid.list_generator():
-        print(infos)
-
-except Exception as e:
-    print("Cannot load the pyramid from its descriptor and read the list")
-
-

Yields

-
-
Iterator[Tuple[Tuple[SlabType,str,int,int], Dict]]
-
Slab indices and storage information
-
-

Value example:

-
(
-    (<SlabType.DATA: 'DATA'>, '18', 5424, 7526),
-    {
-        'link': False,
-        'md5': None,
-        'root': 'pyramids@localhost:9000/LIMADM',
-        'slab': 'DATA_18_5424_7526'
-    }
-)
-
-

Raises

-
-
StorageError
-
Unhandled pyramid storage to copy list
-
MissingEnvironmentError
-
Missing object storage information
-
-
-
-def load_list(self) ‑> int -
-
-

Load list content and cache it

-

If the list is already loaded, nothing is done

-
-
-def write_descriptor(self) ‑> None -
-
-

Write the pyramid's descriptor to the final location (in the pyramid's storage root)

-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/raster.html b/2.2.2/rok4/raster.html deleted file mode 100644 index 174bcbb..0000000 --- a/2.2.2/rok4/raster.html +++ /dev/null @@ -1,676 +0,0 @@ - - - - - - -rok4.raster API documentation - - - - - - - - - - - -
-
-
-

Module rok4.raster

-
-
-

Provide functions to read information on raster data

-

The module contains the following classes :

-
    -
  • Raster - Structure describing raster data.
  • -
  • RasterSet - Structure describing a set of raster data.
  • -
-
-
-
-
-
-
-
-
-

Classes

-
-
-class Raster -
-
-

A structure describing raster data

-

Attributes

-
-
path : str
-
path to the file/object (ex: file:///path/to/image.tif or s3://bucket/path/to/image.tif)
-
bbox : Tuple[float, float, float, float]
-
bounding rectangle in the data projection
-
bands : int
-
number of color bands (or channels)
-
format : ColorFormat
-
numeric variable format for color values. Bit depth, as bits per channel, can be derived from it.
-
mask : str
-
path to the associated mask file or object, if any, or None (same path as the image, but with a ".msk" extension and TIFF format. -Ex: file:///path/to/image.msk or s3://bucket/path/to/image.msk)
-
dimensions : Tuple[int, int]
-
image width and height, in pixels
-
-
- -Expand source code - -
class Raster:
-    """A structure describing raster data
-
-    Attributes:
-        path (str): path to the file/object (ex: file:///path/to/image.tif or s3://bucket/path/to/image.tif)
-        bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
-        bands (int): number of color bands (or channels)
-        format (ColorFormat): numeric variable format for color values. Bit depth, as bits per channel,
-            can be derived from it.
-        mask (str): path to the associated mask file or object, if any, or None (same path as the image, but with a ".msk" extension and TIFF format.
-            Ex: file:///path/to/image.msk or s3://bucket/path/to/image.msk)
-        dimensions (Tuple[int, int]): image width and height, in pixels
-    """
-
-    def __init__(self) -> None:
-        self.bands = None
-        self.bbox = (None, None, None, None)
-        self.dimensions = (None, None)
-        self.format = None
-        self.mask = None
-        self.path = None
-
-    @classmethod
-    def from_file(cls, path: str) -> "Raster":
-        """Creates a Raster object from an image
-
-        Args:
-            path (str): path to the image file/object
-
-        Examples:
-
-            Loading information from a file stored raster TIFF image
-
-                from rok4.raster import Raster
-
-                try:
-                    raster = Raster.from_file(
-                        "file:///data/SC1000/0040_6150_L93.tif"
-                    )
-
-                except Exception as e:
-                    print(f"Cannot load information from image : {e}")
-
-        Raises:
-            FormatError: MASK file is not a TIFF
-            RuntimeError: raised by OGR/GDAL if anything goes wrong
-            NotImplementedError: Storage type not handled
-            FileNotFoundError: File or object does not exist
-
-        Returns:
-            Raster: a Raster instance
-        """
-        if not exists(path):
-            raise FileNotFoundError(f"No file or object found at path '{path}'.")
-
-        self = cls()
-
-        work_image_path = get_osgeo_path(path)
-
-        image_datasource = gdal.Open(work_image_path)
-        self.path = path
-
-        path_pattern = re.compile("(/[^/]+?)[.][a-zA-Z0-9_-]+$")
-        mask_path = path_pattern.sub("\\1.msk", path)
-
-        if exists(mask_path):
-            work_mask_path = get_osgeo_path(mask_path)
-            mask_driver = gdal.IdentifyDriver(work_mask_path).ShortName
-            if "GTiff" != mask_driver:
-                message = f"Mask file '{mask_path}' use GDAL driver : '{mask_driver}'"
-                raise FormatError("TIFF", mask_path, message)
-            self.mask = mask_path
-        else:
-            self.mask = None
-
-        self.bbox = compute_bbox(image_datasource)
-        self.bands = image_datasource.RasterCount
-        self.format = compute_format(image_datasource, path)
-        self.dimensions = (image_datasource.RasterXSize, image_datasource.RasterYSize)
-
-        return self
-
-    @classmethod
-    def from_parameters(
-        cls,
-        path: str,
-        bands: int,
-        bbox: Tuple[float, float, float, float],
-        dimensions: Tuple[int, int],
-        format: ColorFormat,
-        mask: str = None,
-    ) -> "Raster":
-        """Creates a Raster object from parameters
-
-        Args:
-            path (str): path to the file/object (ex: file:///path/to/image.tif or s3://bucket/image.tif)
-            bands (int): number of color bands (or channels)
-            bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
-            dimensions (Tuple[int, int]): image width and height expressed in pixels
-            format (ColorFormat): numeric format for color values. Bit depth, as bits per channel, can be derived from it.
-            mask (str, optional): path to the associated mask, if any, or None (same path as the image, but with a ".msk"
-                extension and TIFF format. ex: file:///path/to/image.msk or s3://bucket/image.msk)
-
-        Examples:
-
-            Loading information from parameters, related to
-              a TIFF main image coupled to a TIFF mask image
-
-                from rok4.raster import Raster
-
-                try:
-                    raster = Raster.from_parameters(
-                        path="file:///data/SC1000/_0040_6150_L93.tif",
-                        mask="file:///data/SC1000/0040_6150_L93.msk",
-                        bands=3,
-                        format=ColorFormat.UINT8,
-                        dimensions=(2000, 2000),
-                        bbox=(40000.000, 5950000.000, 240000.000, 6150000.000)
-                    )
-
-                except Exception as e:
-                    print(
-                      f"Cannot load information from parameters : {e}"
-                    )
-
-        Raises:
-            KeyError: a mandatory argument is missing
-
-        Returns:
-            Raster: a Raster instance
-        """
-        self = cls()
-
-        self.path = path
-        self.bands = bands
-        self.bbox = bbox
-        self.dimensions = dimensions
-        self.format = format
-        self.mask = mask
-        return self
-
-

Static methods

-
-
-def from_file(path: str) ‑> Raster -
-
-

Creates a Raster object from an image

-

Args

-
-
path : str
-
path to the image file/object
-
-

Examples

-

Loading information from a file stored raster TIFF image

-
from rok4.raster import Raster
-
-try:
-    raster = Raster.from_file(
-        "file:///data/SC1000/0040_6150_L93.tif"
-    )
-
-except Exception as e:
-    print(f"Cannot load information from image : {e}")
-
-

Raises

-
-
FormatError
-
MASK file is not a TIFF
-
RuntimeError
-
raised by OGR/GDAL if anything goes wrong
-
NotImplementedError
-
Storage type not handled
-
FileNotFoundError
-
File or object does not exist
-
-

Returns

-
-
Raster
-
a Raster instance
-
-
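The mask detection in `from_file` relies on one substitution: the image's extension is swapped for ".msk". Here is a standalone sketch of that regular expression, taken from the source above (`mask_path_for` is an illustrative wrapper, not part of the module):

```python
import re

# Same pattern as in Raster.from_file: capture the file name without its
# extension, then swap the extension for ".msk".
path_pattern = re.compile(r"(/[^/]+?)[.][a-zA-Z0-9_-]+$")

def mask_path_for(image_path: str) -> str:
    # Illustrative helper, not part of rok4.raster
    return path_pattern.sub(r"\1.msk", image_path)

print(mask_path_for("file:///data/SC1000/0040_6150_L93.tif"))
# file:///data/SC1000/0040_6150_L93.msk
```

The anchored `$` and the `[^/]` class ensure only the last path component is rewritten, so storage prefixes such as file:// or s3:// are left untouched.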
-
-def from_parameters(path: str, bands: int, bbox: Tuple[float, float, float, float], dimensions: Tuple[int, int], format: ColorFormat, mask: str = None) ‑> Raster -
-
-

Creates a Raster object from parameters

-

Args

-
-
path : str
-
path to the file/object (ex: file:///path/to/image.tif or s3://bucket/image.tif)
-
bands : int
-
number of color bands (or channels)
-
bbox : Tuple[float, float, float, float]
-
bounding rectangle in the data projection
-
dimensions : Tuple[int, int]
-
image width and height expressed in pixels
-
format : ColorFormat
-
numeric format for color values. Bit depth, as bits per channel, can be derived from it.
-
mask : str, optional
-
path to the associated mask, if any, or None (same path as the image, but with a ".msk" -extension and TIFF format. ex: file:///path/to/image.msk or s3://bucket/image.msk)
-
-

Examples

-

Loading information from parameters, related to -a TIFF main image coupled to a TIFF mask image

-
from rok4.raster import Raster
-
-try:
-    raster = Raster.from_parameters(
-        path="file:///data/SC1000/_0040_6150_L93.tif",
-        mask="file:///data/SC1000/0040_6150_L93.msk",
-        bands=3,
-        format=ColorFormat.UINT8,
-        dimensions=(2000, 2000),
-        bbox=(40000.000, 5950000.000, 240000.000, 6150000.000)
-    )
-
-except Exception as e:
-    print(
-      f"Cannot load information from parameters : {e}"
-    )
-
-

Raises

-
-
KeyError
-
a mandatory argument is missing
-
-

Returns

-
-
Raster
-
a Raster instance
-
-
-
-
-
-class RasterSet -
-
-

A structure describing a set of raster data

-

Attributes

-
-
raster_list : List[Raster]
-
List of Raster instances in the set
-
colors : Set[Tuple[int, ColorFormat]]
-
Set (distinct values) of color properties (bands and format) found in the raster set.
-
srs : str
-
Name of the set's spatial reference system
-
bbox : Tuple[float, float, float, float]
-
bounding rectangle in the data projection, enclosing the whole set
-
-
- -Expand source code - -
class RasterSet:
-    """A structure describing a set of raster data
-
-    Attributes:
-        raster_list (List[Raster]): List of Raster instances in the set
-        colors (Set[Tuple[int, ColorFormat]]): Set (distinct values) of color properties (bands and format) found in the raster set.
-        srs (str): Name of the set's spatial reference system
-        bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection, enclosing the whole set
-    """
-
-    def __init__(self) -> None:
-        self.bbox = (None, None, None, None)
-        self.colors = set()
-        self.raster_list = []
-        self.srs = None
-
-    @classmethod
-    def from_list(cls, path: str, srs: str) -> "RasterSet":
-        """Instanciate a RasterSet from an images list path and a srs
-
-        Args:
-            path (str): path to the images list file or object (each line in this list contains the path to an image file or object in the set)
-            srs (str): images' coordinate system
-
-        Examples:
-
-            Loading information from a file stored list
-
-                from rok4.raster import RasterSet
-
-                try:
-                    raster_set = RasterSet.from_list(
-                        path="file:///data/SC1000.list",
-                        srs="EPSG:3857"
-                    )
-
-                except Exception as e:
-                    print(
-                        f"Cannot load information from list file : {e}"
-                    )
-
-        Raises:
-            RuntimeError: raised by OGR/GDAL if anything goes wrong
-            NotImplementedError: Storage type not handled
-
-        Returns:
-            RasterSet: a RasterSet instance
-        """
-        self = cls()
-        self.srs = srs
-
-        # Chargement de la liste des images (la liste peut être un fichier ou un objet)
-        list_obj = tempfile.NamedTemporaryFile(mode="r", delete=False)
-        list_file = list_obj.name
-        copy(path, f"file://{list_file}")
-        list_obj.close()
-        image_list = []
-        with open(list_file) as listin:
-            for line in listin:
-                image_path = line.strip(" \t\n\r")
-                image_list.append(image_path)
-
-        remove(f"file://{list_file}")
-
-        bbox = [None, None, None, None]
-        for image_path in image_list:
-            raster = Raster.from_file(image_path)
-            self.raster_list.append(raster)
-
-            # Mise à jour de la bbox globale
-            if bbox == [None, None, None, None]:
-                bbox = list(raster.bbox)
-            else:
-                if bbox[0] > raster.bbox[0]:
-                    bbox[0] = raster.bbox[0]
-                if bbox[1] > raster.bbox[1]:
-                    bbox[1] = raster.bbox[1]
-                if bbox[2] < raster.bbox[2]:
-                    bbox[2] = raster.bbox[2]
-                if bbox[3] < raster.bbox[3]:
-                    bbox[3] = raster.bbox[3]
-
-            # Inventaire des colors distinctes
-            self.colors.add((raster.bands, raster.format))
-
-        self.bbox = tuple(bbox)
-
-        return self
-
-    @classmethod
-    def from_descriptor(cls, path: str) -> "RasterSet":
-        """Creates a RasterSet object from a descriptor file or object
-
-        Args:
-            path (str): path to the descriptor file or object
-
-        Examples:
-
-            Loading information from a file stored descriptor
-
-                from rok4.raster import RasterSet
-
-                try:
-                    raster_set = RasterSet.from_descriptor(
-                        "file:///data/images/descriptor.json"
-                    )
-
-                except Exception as e:
-                    message = f"Cannot load information from descriptor file : {e}"
-                    print(message)
-
-        Raises:
-            RuntimeError: raised by OGR/GDAL if anything goes wrong
-            NotImplementedError: Storage type not handled
-
-        Returns:
-            RasterSet: a RasterSet instance
-        """
-        self = cls()
-
-        try:
-            serialization = json.loads(get_data_str(path))
-
-        except JSONDecodeError as e:
-            raise FormatError("JSON", path, e)
-
-        self.srs = serialization["srs"]
-        self.raster_list = []
-        for raster_dict in serialization["raster_list"]:
-            parameters = deepcopy(raster_dict)
-            parameters["bbox"] = tuple(raster_dict["bbox"])
-            parameters["dimensions"] = tuple(raster_dict["dimensions"])
-            parameters["format"] = ColorFormat[raster_dict["format"]]
-            self.raster_list.append(Raster.from_parameters(**parameters))
-
-        self.bbox = tuple(serialization["bbox"])
-        for color_dict in serialization["colors"]:
-            self.colors.add((color_dict["bands"], ColorFormat[color_dict["format"]]))
-
-        return self
-
-    @property
-    def serializable(self) -> Dict:
-        """Get the dict version of the raster set, descriptor compliant
-
-        Returns:
-            Dict: descriptor structured object description
-        """
-        serialization = {"bbox": list(self.bbox), "srs": self.srs, "colors": [], "raster_list": []}
-        for color in self.colors:
-            color_serial = {"bands": color[0], "format": color[1].name}
-            serialization["colors"].append(color_serial)
-        for raster in self.raster_list:
-            raster_dict = {
-                "path": raster.path,
-                "dimensions": list(raster.dimensions),
-                "bbox": list(raster.bbox),
-                "bands": raster.bands,
-                "format": raster.format.name,
-            }
-            if raster.mask is not None:
-                raster_dict["mask"] = raster.mask
-            serialization["raster_list"].append(raster_dict)
-
-        return serialization
-
-    def write_descriptor(self, path: str = None) -> None:
-        """Print raster set's descriptor as JSON
-
-        Args:
-            path (str, optional): Complete path (file or object) where to print the raster set's JSON. Defaults to None, JSON is printed to standard output.
-        """
-        content = json.dumps(self.serializable, sort_keys=True)
-        if path is None:
-            print(content)
-        else:
-            put_data_str(content, path)
-
-

Static methods

-
-
-def from_descriptor(path: str) ‑> RasterSet -
-
-

Creates a RasterSet object from a descriptor file or object

-

Args

-
-
path : str
-
path to the descriptor file or object
-
-

Examples

-

Loading information from a file stored descriptor

-
from rok4.raster import RasterSet
-
-try:
-    raster_set = RasterSet.from_descriptor(
-        "file:///data/images/descriptor.json"
-    )
-
-except Exception as e:
-    message = f"Cannot load information from descriptor file : {e}"
-    print(message)
-
-

Raises

-
-
RuntimeError
-
raised by OGR/GDAL if anything goes wrong
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
RasterSet
-
a RasterSet instance
-
-
-
-def from_list(path: str, srs: str) ‑> RasterSet -
-
-

Instantiate a RasterSet from an image list path and an SRS

-

Args

-
-
path : str
-
path to the images list file or object (each line in this list contains the path to an image file or object in the set)
-
srs : str
-
images' coordinate system
-
-

Examples

-

Loading information from a file stored list

-
from rok4.raster import RasterSet
-
-try:
-    raster_set = RasterSet.from_list(
-        path="file:///data/SC1000.list",
-        srs="EPSG:3857"
-    )
-
-except Exception as e:
-    print(
-        f"Cannot load information from list file : {e}"
-    )
-
-

Raises

-
-
RuntimeError
-
raised by OGR/GDAL if anything goes wrong
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
RasterSet
-
a RasterSet instance
-
-
-
-

Instance variables

-
-
prop serializable : Dict[~KT, ~VT]
-
-

Get the dict version of the raster set, descriptor compliant

-

Returns

-
-
Dict
-
descriptor structured object description
-
-
- -Expand source code - -
@property
-def serializable(self) -> Dict:
-    """Get the dict version of the raster set, descriptor compliant
-
-    Returns:
-        Dict: descriptor structured object description
-    """
-    serialization = {"bbox": list(self.bbox), "srs": self.srs, "colors": [], "raster_list": []}
-    for color in self.colors:
-        color_serial = {"bands": color[0], "format": color[1].name}
-        serialization["colors"].append(color_serial)
-    for raster in self.raster_list:
-        raster_dict = {
-            "path": raster.path,
-            "dimensions": list(raster.dimensions),
-            "bbox": list(raster.bbox),
-            "bands": raster.bands,
-            "format": raster.format.name,
-        }
-        if raster.mask is not None:
-            raster_dict["mask"] = raster.mask
-        serialization["raster_list"].append(raster_dict)
-
-    return serialization
-
-
-
-

Methods

-
-
-def write_descriptor(self, path: str = None) ‑> None -
-
-

Print raster set's descriptor as JSON

-

Args

-
-
path : str, optional
-
Complete path (file or object) where to print the raster set's JSON. Defaults to None, JSON is printed to standard output.
-
-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/storage.html b/2.2.2/rok4/storage.html deleted file mode 100644 index 5720a22..0000000 --- a/2.2.2/rok4/storage.html +++ /dev/null @@ -1,423 +0,0 @@ - - - - - - -rok4.storage API documentation - - - - - - - - - - - -
-
-
-

Module rok4.storage

-
-
-

Provide functions to read or write data

-

Available storage types are :

-
    -
  • S3 (paths are prefixed with s3://)
  • -
  • CEPH (paths are prefixed with ceph://)
  • -
  • FILE (paths are prefixed with file://, but this is also the default interpretation of paths)
  • -
  • HTTP (paths are prefixed with http://)
  • -
  • HTTPS (paths are prefixed with https://)
  • -
-

Depending on the function, not all storage types are necessarily available.

-

Reads use an LRU cache system with a TTL. It can be configured with environment variables :

-
    -
  • ROK4_READING_LRU_CACHE_SIZE : Number of cached elements. Default 64. Set 0 or a negative integer to configure an unbounded cache. A power of two makes the cache more efficient.
  • -
  • ROK4_READING_LRU_CACHE_TTL : Validity duration of cached elements, in seconds. Default 300. Set 0 or a negative integer to get a cache without expiration.
  • -
-

To disable the cache (always read data from storage), set ROK4_READING_LRU_CACHE_SIZE to 1 and ROK4_READING_LRU_CACHE_TTL to 1.

-

Using CEPH storage requires environment variables :

-
    -
  • ROK4_CEPH_CONFFILE
  • -
  • ROK4_CEPH_USERNAME
  • -
  • ROK4_CEPH_CLUSTERNAME
  • -
-

Using S3 storage requires environment variables :

-
    -
  • ROK4_S3_KEY
  • -
  • ROK4_S3_SECRETKEY
  • -
  • ROK4_S3_URL
  • -
-

To use several S3 clusters, each environment variable has to contain a comma-separated list, with the same number of elements

-

Example, working with 2 S3 clusters:

- -
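The example itself was lost in rendering; a plausible reconstruction follows. The key and secret values are placeholders, and only the variable names, the two hosts and the comma-separated layout come from this documentation:

```shell
# Placeholder credentials for two S3 clusters: element i of each
# comma-separated list describes cluster i.
export ROK4_S3_URL="https://s3.storage.fr,https://s4.storage.fr"
export ROK4_S3_KEY="KEY1,KEY2"
export ROK4_S3_SECRETKEY="SECRET1,SECRET2"
```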

To specify the cluster to use, the bucket name should be bucket_name@s3.storage.fr or bucket_name@s4.storage.fr. If no host is defined (no @) in the bucket name, the first S3 cluster is used

-
-
-
-
-
-
-

Functions

-
-
-def copy(from_path: str, to_path: str, from_md5: str = None) ‑> None -
-
-

Copy a file or object to a file or object location. If an MD5 sum is provided, it is compared to the sum computed after the copy.

-

Args

-
-
from_path : str
-
source file/object path, to copy
-
to_path : str
-
destination file/object path
-
from_md5 : str, optional
-
MD5 sum, recomputed after the copy and checked. Defaults to None.
-
-

Raises

-
-
StorageError
-
Copy issue
-
MissingEnvironmentError
-
Missing object storage informations
-
NotImplementedError
-
Storage type not handled
-
-
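For the FILE-to-FILE case, the MD5 control described here reduces to copy, recompute, compare. A minimal sketch under that assumption (`copy_with_md5` is an illustrative name; the real function also handles object storages):

```python
import hashlib
import shutil

def copy_with_md5(from_path: str, to_path: str, from_md5: str = None) -> None:
    # Copy a local file, then verify the destination's MD5 sum if provided
    shutil.copyfile(from_path, to_path)
    if from_md5 is not None:
        with open(to_path, "rb") as f:
            checksum = hashlib.md5(f.read()).hexdigest()
        if checksum != from_md5:
            raise RuntimeError(f"MD5 mismatch on '{to_path}': {checksum} != {from_md5}")
```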
-
-def disconnect_ceph_clients() ‑> None -
-
-

Clean CEPH clients

-
-
-def disconnect_s3_clients() ‑> None -
-
-

Clean S3 clients

-
-
-def exists(path: str) ‑> bool -
-
-

Does the file or object exist?

-

Args

-
-
path : str
-
path of file/object to test
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage informations
-
StorageError
-
Storage read issue
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
bool
-
file/object existing status
-
-
-
-def get_data_binary(path: str, range: Tuple[int, int] = None) ‑> str -
-
-

Load data into a binary string

-

This function uses an LRU cache, with a TTL of 5 minutes

-

Args

-
-
path : str
-
path to data
-
range : Tuple[int, int], optional
-
offset and size, to make a partial read. Defaults to None.
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage informations
-
StorageError
-
Storage read issue
-
FileNotFoundError
-
File or object does not exist
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
str
-
Data binary content
-
-
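The optional (offset, size) range described above behaves like a seek-and-read. A file-storage-only sketch of that behavior (`read_range` is an illustrative name, not the module's API):

```python
from typing import Tuple

def read_range(path: str, rng: Tuple[int, int] = None) -> bytes:
    # Read the whole file, or only `size` bytes from `offset` when rng=(offset, size)
    with open(path, "rb") as f:
        if rng is None:
            return f.read()
        offset, size = rng
        f.seek(offset)
        return f.read(size)
```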
-
-def get_data_str(path: str) ‑> str -
-
-

Load full data into a string

-

Args

-
-
path : str
-
path to data
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
FileNotFoundError
-
File or object does not exist
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
str
-
Data content
-
-
-
-def get_infos_from_path(path: str) ‑> Tuple[StorageType, str, str, str] -
-
-

Extract the storage type, the unprefixed path, the container and the basename from a path (default: FILE storage)

-

For a FILE storage, the tray is the directory and the basename is the file name.

-

For an object storage (CEPH or S3), the tray is the bucket or the pool and the basename is the object name. -For a S3 bucket, the format can be bucket_name@cluster_host to use several clusters. The cluster name is the host (without protocol)

-

Args

-
-
path : str
-
path to analyse
-
-

Returns

-
-
Tuple[StorageType, str, str, str]
-
storage type, unprefixed path, the container and the basename
-
-
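A self-contained sketch of this decomposition, using a hypothetical StorageType mirror (the real enum lives in rok4, and the exact return values of the actual function may differ):

```python
import os.path
from enum import Enum

class StorageType(Enum):
    # Illustrative mirror of rok4's storage types
    FILE = "file://"
    S3 = "s3://"
    CEPH = "ceph://"

def split_storage_path(path: str):
    # Return (storage type, unprefixed path, container, basename)
    storage_type = StorageType.FILE  # default interpretation of paths
    for st in StorageType:
        if path.startswith(st.value):
            storage_type = st
            path = path[len(st.value):]
            break
    if storage_type is StorageType.FILE:
        # tray is the directory, basename is the file name
        container, basename = os.path.split(path)
    else:
        # tray is the bucket or pool, basename is the object name
        container, _, basename = path.partition("/")
    return storage_type, path, container, basename
```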
-
-def get_osgeo_path(path: str) ‑> str -
-
-

Return GDAL/OGR Open compliant path and configure storage access

-

For a S3 input path, endpoint, access and secret keys are set and path is built with "/vsis3" root.

-

For a FILE input path, only storage prefix is removed

-

Args

-
-
path : str
-
Source path
-
-

Raises

-
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
str
-
GDAL/OGR Open compliant path
-
-
-
-def get_path_from_infos(storage_type: StorageType, *args) ‑> str -
-
-

Build the full path from its elements

-

Prefixed with the storage's type, elements are joined with a slash

-

Args

-
-
storage_type : StorageType
-
Storage's type for path
-
-

Returns

-
-
str
-
Full path
-
-
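A sketch of this joining rule, with a hypothetical StorageType mirror (the real enum lives in rok4; `join_storage_path` is an illustrative name):

```python
from enum import Enum

class StorageType(Enum):
    # Illustrative mirror of rok4's storage types
    FILE = "file://"
    S3 = "s3://"

def join_storage_path(storage_type: StorageType, *args: str) -> str:
    # Prefix with the storage type, join the elements with slashes
    return storage_type.value + "/".join(args)
```

For instance, joining StorageType.S3 with "bucket_name" and "path/to/object.json" yields "s3://bucket_name/path/to/object.json".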
-
-def get_size(path: str) ‑> int -
-
-

Get size of file or object

-

Args

-
-
path : str
-
path of the file/object whose size is asked
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
int
-
file/object size, in bytes
-
-
-
-def hash_file(path: str) ‑> str -
-
-

Compute the MD5 sum of the provided file

-

Args

-
-
path : str
-
path to file
-
-

Returns

-
-
str
-
hexadecimal MD5 sum
-
-
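A chunked sketch of such an MD5 computation (`md5_file` is an illustrative name; streaming avoids loading large rasters fully in memory, though the module's implementation may differ):

```python
import hashlib

def md5_file(path: str) -> str:
    # Stream the file in 64 KiB chunks and return the hexadecimal MD5 sum
    checksum = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            checksum.update(chunk)
    return checksum.hexdigest()
```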
-def link(target_path: str, link_path: str, hard: bool = False) ‑> None -
-

Create a symbolic link

-

Args

-
-
target_path : str
-
file/object to link
-
link_path : str
-
link to create
-
hard : bool, optional
-
hard link rather than symbolic. Only for FILE storage. Defaults to False.
-
-

Raises

-
-
StorageError
-
link issue
-
MissingEnvironmentError
-
Missing object storage information
-
NotImplementedError
-
Storage type not handled
-
-
-
-def put_data_str(data: str, path: str) ‑> None -
-
-

Store string data into a file or an object

-

UTF-8 encoding is used for bytes conversion

-

Args

-
-
data : str
-
data to write
-
path : str
-
destination path, where to write data
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage write issue
-
NotImplementedError
-
Storage type not handled
-
-
-
-def remove(path: str) ‑> None -
-
-

Remove the file/object

-

Args

-
-
path : str
-
path of file/object to remove
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage removal issue
-
NotImplementedError
-
Storage type not handled
-
-
-
-def size_path(path: str) ‑> int -
-
-

Return the size of the given path (or, for CEPH, the sum of the sizes of the objects referenced in the .list)

-

Args

-
-
path : str
-
Source path
-
-

Raises

-
-
StorageError
-
Unhandled link or link issue
-
MissingEnvironmentError
-
Missing object storage information
-
NotImplementedError
-
Storage type not handled
-
-

Returns

-
-
int
-
size of the path
-
-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/style.html b/2.2.2/rok4/style.html deleted file mode 100644 index ff8b157..0000000 --- a/2.2.2/rok4/style.html +++ /dev/null @@ -1,1127 +0,0 @@ - - - - - - -rok4.style API documentation - - - - - - - - - - - -
-
-
-

Module rok4.style

-
-
-

Provide classes to use a ROK4 style.

-

The module contains the following classes:

-
    -
  • Style - Style descriptor, to convert raster data
  • -
-

Loading a style requires environment variables:

-
    -
  • ROK4_STYLES_DIRECTORY
  • -
-
-
-
-
-
-
-
-
-

Classes

-
-
-class Colour -(palette: Dict[~KT, ~VT], style: Style) -
-
-

A palette's RGBA colour.

-

Attributes

-
-
value : float
-
Value to convert to RGBA
-
red : int
-
Red value (from 0 to 255)
-
green : int
-
Green value (from 0 to 255)
-
blue : int
-
Blue value (from 0 to 255)
-
alpha : int
-
Alpha value (from 0 to 255)
-
-

Constructor method

-

Args

-
-
palette
-
Colour attributes, according to JSON structure
-
style
-
Style object containing the palette's colour to create
-
-

Examples

-

JSON colour section

-
{
-    "value": 600,
-    "red": 220,
-    "green": 179,
-    "blue": 99,
-    "alpha": 255
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
Exception
-
Invalid colour's band
-
-
- -Expand source code - -
class Colour:
-    """A palette's RGBA colour.
-
-    Attributes:
-        value (float): Value to convert to RGBA
-        red (int): Red value (from 0 to 255)
-        green (int): Green value (from 0 to 255)
-        blue (int): Blue value (from 0 to 255)
-        alpha (int): Alpha value (from 0 to 255)
-    """
-
-    def __init__(self, palette: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            palette: Colour attributes, according to JSON structure
-            style: Style object containing the palette's colour to create
-
-        Examples:
-
-            JSON colour section
-
-                {
-                    "value": 600,
-                    "red": 220,
-                    "green": 179,
-                    "blue": 99,
-                    "alpha": 255
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-            Exception: Invalid colour's band
-        """
-
-        try:
-            self.value = palette["value"]
-
-            self.red = palette["red"]
-            if self.red < 0 or self.red > 255:
-                raise Exception(
-                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
-                )
-            self.green = palette["green"]
-            if self.green < 0 or self.green > 255:
-                raise Exception(
-                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
-                )
-            self.blue = palette["blue"]
-            if self.blue < 0 or self.blue > 255:
-                raise Exception(
-                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
-                )
-            self.alpha = palette["alpha"]
-            if self.alpha < 0 or self.alpha > 255:
-                raise Exception(
-                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
-                )
-
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"palette.colours[].{e}")
-
-        except TypeError:
-            raise Exception(
-                f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
-            )
-
-    @property
-    def rgba(self) -> Tuple[int]:
-        return (self.red, self.green, self.blue, self.alpha)
-
-    @property
-    def rgb(self) -> Tuple[int]:
-        return (self.red, self.green, self.blue)
-
-

Instance variables

-
-
prop rgb : Tuple[int]
-
-
-
- -Expand source code - -
@property
-def rgb(self) -> Tuple[int]:
-    return (self.red, self.green, self.blue)
-
-
-
prop rgba : Tuple[int]
-
-
-
- -Expand source code - -
@property
-def rgba(self) -> Tuple[int]:
-    return (self.red, self.green, self.blue, self.alpha)
-
-
-
-
-
-class Estompage -(estompage: Dict[~KT, ~VT], style: Style) -
-
-

A style's estompage parameters.

-

Attributes

-
-
zenith : float
-
Sun's zenith in degree
-
azimuth : float
-
Sun's azimuth in degree
-
z_factor : int
-
Slope exaggeration factor
-
image_nodata : float
-
Nodata input value
-
estompage_nodata : float
-
Nodata estompage value
-
-

Constructor method

-

Args

-
-
estompage
-
Estompage attributes, according to JSON structure
-
style
-
Style object containing the estompage to create
-
-

Examples

-

JSON estompage section

-
{
-    "zenith": 45,
-    "azimuth": 315,
-    "z_factor": 1
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class Estompage:
-    """A style's estompage parameters.
-
-    Attributes:
-        zenith (float): Sun's zenith in degree
-        azimuth (float): Sun's azimuth in degree
-        z_factor (int): Slope exaggeration factor
-        image_nodata (float): Nodata input value
-        estompage_nodata (float): Nodata estompage value
-    """
-
-    def __init__(self, estompage: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            estompage: Estompage attributes, according to JSON structure
-            style: Style object containing the estompage to create
-
-        Examples:
-
-            JSON estompage section
-
-                {
-                    "zenith": 45,
-                    "azimuth": 315,
-                    "z_factor": 1
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        try:
-            # zenith and azimuth are converted to their complements, in radians
-            self.zenith = (90.0 - estompage.get("zenith", 45)) * DEG_TO_RAD
-            self.azimuth = (360.0 - estompage.get("azimuth", 315)) * DEG_TO_RAD
-            self.z_factor = estompage.get("z_factor", 1)
-            self.image_nodata = estompage.get("image_nodata", -99999.0)
-            self.estompage_nodata = estompage.get("estompage_nodata", 0.0)
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"estompage.{e}")
-
-
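The zenith/azimuth conversion performed by the constructor can be checked with the JSON defaults shown above (45 and 315 degrees); both complements come out to π/4 radians. A standalone sketch mirroring that arithmetic:

```python
import math

DEG_TO_RAD = math.pi / 180.0

# Mirrors the Estompage constructor: the zenith is stored as its
# elevation complement and the azimuth as its counter-clockwise
# complement, both converted to radians
zenith_deg, azimuth_deg = 45, 315  # the defaults from the JSON example
zenith = (90.0 - zenith_deg) * DEG_TO_RAD
azimuth = (360.0 - azimuth_deg) * DEG_TO_RAD
```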
-
-class Exposition -(exposition: Dict[~KT, ~VT], style: Style) -
-
-

A style's exposition parameters.

-

Attributes

-
-
algo : str
-
Slope calculation algorithm chosen by the user ("H" for Horn)
-
min_slope : int
-
Slope from which exposition is computed
-
image_nodata : float
-
Nodata input value
-
exposition_nodata : float
-
Nodata exposition value
-
-

Constructor method

-

Args

-
-
exposition
-
Exposition attributes, according to JSON structure
-
style
-
Style object containing the exposition to create
-
-

Examples

-

JSON exposition section

-
{
-    "algo": "H",
-    "min_slope": 1
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class Exposition:
-    """A style's exposition parameters.
-
-    Attributes:
-        algo (str): Slope calculation algorithm chosen by the user ("H" for Horn)
-        min_slope (int): Slope from which exposition is computed
-        image_nodata (float): Nodata input value
-        exposition_nodata (float): Nodata exposition value
-    """
-
-    def __init__(self, exposition: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            exposition: Exposition attributes, according to JSON structure
-            style: Style object containing the exposition to create
-
-        Examples:
-
-            JSON exposition section
-
-                {
-                    "algo": "H",
-                    "min_slope": 1
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        try:
-            self.algo = exposition.get("algo", "H")
-            self.min_slope = exposition.get("min_slope", 1.0) * DEG_TO_RAD
-            self.image_nodata = exposition.get("image_nodata", -99999)
-            self.exposition_nodata = exposition.get("aspect_nodata", -1)
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"exposition.{e}")
-
-
-
-class Legend -(legend: Dict[~KT, ~VT], style: Style) -
-
-

A style's legend.

-

Attributes

-
-
format : str
-
Legend image's mime type
-
url : str
-
Legend image's url
-
height : int
-
Legend image's pixel height
-
width : int
-
Legend image's pixel width
-
min_scale_denominator : int
-
Minimum scale at which the legend is applicable
-
max_scale_denominator : int
-
Maximum scale at which the legend is applicable
-
-

Constructor method

-

Args

-
-
legend
-
Legend attributes, according to JSON structure
-
style
-
Style object containing the legend to create
-
-

Examples

-

JSON legend section

-
{
-    "format": "image/png",
-    "url": "http://ign.fr",
-    "height": 100,
-    "width": 100,
-    "min_scale_denominator": 0,
-    "max_scale_denominator": 30
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class Legend:
-    """A style's legend.
-
-    Attributes:
-        format (str): Legend image's mime type
-        url (str): Legend image's url
-        height (int): Legend image's pixel height
-        width (int): Legend image's pixel width
-        min_scale_denominator (int): Minimum scale at which the legend is applicable
-        max_scale_denominator (int): Maximum scale at which the legend is applicable
-    """
-
-    def __init__(self, legend: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            legend: Legend attributes, according to JSON structure
-            style: Style object containing the legend to create
-
-        Examples:
-
-            JSON legend section
-
-                {
-                    "format": "image/png",
-                    "url": "http://ign.fr",
-                    "height": 100,
-                    "width": 100,
-                    "min_scale_denominator": 0,
-                    "max_scale_denominator": 30
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        try:
-            self.format = legend["format"]
-            self.url = legend["url"]
-            self.height = legend["height"]
-            self.width = legend["width"]
-            self.min_scale_denominator = legend["min_scale_denominator"]
-            self.max_scale_denominator = legend["max_scale_denominator"]
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"legend.{e}")
-
-
-
-class Palette -(palette: Dict[~KT, ~VT], style: Style) -
-
-

A style's RGBA palette.

-

Attributes

-
-
no_alpha : bool
-
Colour without alpha band
-
rgb_continuous : bool
-
Continuous RGB values?
-
alpha_continuous : bool
-
Continuous alpha values?
-
colours : List[Colour]
-
Palette's colours, input values ascending
-
-

Constructor method

-

Args

-
-
palette
-
Palette attributes, according to JSON structure
-
style
-
Style object containing the palette to create
-
-

Examples

-

JSON palette section

-
{
-    "no_alpha": false,
-    "rgb_continuous": true,
-    "alpha_continuous": true,
-    "colours": [
-        { "value": -99999, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
-        { "value": -99998.1, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
-        { "value": -99998.0, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
-        { "value": -501, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
-        { "value": -500, "red": 1, "green": 29, "blue": 148, "alpha": 255 },
-        { "value": -15, "red": 19, "green": 42, "blue": 255, "alpha": 255 },
-        { "value": 0, "red": 67, "green": 105, "blue": 227, "alpha": 255 },
-        { "value": 0.01, "red": 57, "green": 151, "blue": 105, "alpha": 255 },
-        { "value": 300, "red": 230, "green": 230, "blue": 128, "alpha": 255 },
-        { "value": 600, "red": 220, "green": 179, "blue": 99, "alpha": 255 },
-        { "value": 2000, "red": 162, "green": 100, "blue": 51, "alpha": 255 },
-        { "value": 2500, "red": 122, "green": 81, "blue": 40, "alpha": 255 },
-        { "value": 3000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
-        { "value": 9000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
-        { "value": 9001, "red": 255, "green": 255, "blue": 255, "alpha": 255 }
-    ]
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
Exception
-
No colour in the palette or invalid colour
-
-
- -Expand source code - -
class Palette:
-    """A style's RGBA palette.
-
-    Attributes:
-        no_alpha (bool): Colour without alpha band
-        rgb_continuous (bool): Continuous RGB values?
-        alpha_continuous (bool): Continuous alpha values?
-        colours (List[Colour]): Palette's colours, input values ascending
-    """
-
-    def __init__(self, palette: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            palette: Palette attributes, according to JSON structure
-            style: Style object containing the palette to create
-
-        Examples:
-
-            JSON palette section
-
-                {
-                    "no_alpha": false,
-                    "rgb_continuous": true,
-                    "alpha_continuous": true,
-                    "colours": [
-                        { "value": -99999, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
-                        { "value": -99998.1, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
-                        { "value": -99998.0, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
-                        { "value": -501, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
-                        { "value": -500, "red": 1, "green": 29, "blue": 148, "alpha": 255 },
-                        { "value": -15, "red": 19, "green": 42, "blue": 255, "alpha": 255 },
-                        { "value": 0, "red": 67, "green": 105, "blue": 227, "alpha": 255 },
-                        { "value": 0.01, "red": 57, "green": 151, "blue": 105, "alpha": 255 },
-                        { "value": 300, "red": 230, "green": 230, "blue": 128, "alpha": 255 },
-                        { "value": 600, "red": 220, "green": 179, "blue": 99, "alpha": 255 },
-                        { "value": 2000, "red": 162, "green": 100, "blue": 51, "alpha": 255 },
-                        { "value": 2500, "red": 122, "green": 81, "blue": 40, "alpha": 255 },
-                        { "value": 3000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
-                        { "value": 9000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
-                        { "value": 9001, "red": 255, "green": 255, "blue": 255, "alpha": 255 }
-                    ]
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-            Exception: No colour in the palette or invalid colour
-        """
-
-        try:
-            self.no_alpha = palette["no_alpha"]
-            self.rgb_continuous = palette["rgb_continuous"]
-            self.alpha_continuous = palette["alpha_continuous"]
-
-            self.colours = []
-            for colour in palette["colours"]:
-                self.colours.append(Colour(colour, style))
-                if len(self.colours) >= 2 and self.colours[-1].value <= self.colours[-2].value:
-                    raise Exception(
-                        f"Style '{style.path}' palette colours have to be ordered by ascending input value"
-                    )
-
-            if len(self.colours) == 0:
-                raise Exception(f"Style '{style.path}' palette has no colour")
-
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"palette.{e}")
-
-    def convert(self, value: float) -> Tuple[int]:
-
-        # Palette colours are sorted by ascending input value
-        # First, handle values falling outside the palette's range
-
-        if value <= self.colours[0].value:
-            if self.no_alpha:
-                return self.colours[0].rgb
-            else:
-                return self.colours[0].rgba
-
-        if value >= self.colours[-1].value:
-            if self.no_alpha:
-                return self.colours[-1].rgb
-            else:
-                return self.colours[-1].rgba
-
-        # Find the two colours the value falls between
-        for i in range(1, len(self.colours)):
-            if self.colours[i].value < value:
-                continue
-
-            # This is the first colour with a greater value
-            colour_inf = self.colours[i - 1]
-            colour_sup = self.colours[i]
-            break
-
-        ratio = (value - colour_inf.value) / (colour_sup.value - colour_inf.value)
-        if self.rgb_continuous:
-            pixel = (
-                colour_inf.red + ratio * (colour_sup.red - colour_inf.red),
-                colour_inf.green + ratio * (colour_sup.green - colour_inf.green),
-                colour_inf.blue + ratio * (colour_sup.blue - colour_inf.blue),
-            )
-        else:
-            pixel = (colour_inf.red, colour_inf.green, colour_inf.blue)
-
-        if self.no_alpha:
-            return pixel
-        else:
-            if self.alpha_continuous:
-                return pixel + (colour_inf.alpha + ratio * (colour_sup.alpha - colour_inf.alpha),)
-            else:
-                return pixel + (colour_inf.alpha,)
-
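The continuous branch of `convert` is a piecewise-linear interpolation between consecutive palette stops, clamped at both ends. A standalone sketch over (value, RGBA) pairs, illustrative rather than the library's code:

```python
from typing import List, Tuple

def interpolate_colour(
    value: float, stops: List[Tuple[float, Tuple[float, ...]]]
) -> Tuple[float, ...]:
    """Piecewise-linear RGBA interpolation, clamped at the palette's ends."""
    if value <= stops[0][0]:
        return stops[0][1]
    if value >= stops[-1][0]:
        return stops[-1][1]
    for i in range(1, len(stops)):
        if stops[i][0] < value:
            continue
        # First stop with a greater-or-equal value: interpolate between
        # it and its predecessor
        v_inf, c_inf = stops[i - 1]
        v_sup, c_sup = stops[i]
        ratio = (value - v_inf) / (v_sup - v_inf)
        return tuple(a + ratio * (b - a) for a, b in zip(c_inf, c_sup))
```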
-

Methods

-
-
-def convert(self, value: float) ‑> Tuple[int] -
-
-
-
-
-
-
-class Slope -(slope: Dict[~KT, ~VT], style: Style) -
-
-

A style's slope parameters.

-

Attributes

-
-
algo : str
-
Slope calculation algorithm chosen by the user ("H" for Horn)
-
unit : str
-
Slope unit
-
image_nodata : float
-
Nodata input value
-
slope_nodata : float
-
Nodata slope value
-
slope_max : float
-
Maximum value for the slope
-
-

Constructor method

-

Args

-
-
slope
-
Slope attributes, according to JSON structure
-
style
-
Style object containing the slope to create
-
-

Examples

-

JSON pente section

-
{
-    "algo": "H",
-    "unit": "degree",
-    "image_nodata": -99999,
-    "slope_nodata": 91,
-    "slope_max": 90
-}
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class Slope:
-    """A style's slope parameters.
-
-    Attributes:
-        algo (str): Slope calculation algorithm chosen by the user ("H" for Horn)
-        unit (str): Slope unit
-        image_nodata (float): Nodata input value
-        slope_nodata (float): Nodata slope value
-        slope_max (float): Maximum value for the slope
-    """
-
-    def __init__(self, slope: Dict, style: "Style") -> None:
-        """Constructor method
-
-        Args:
-            slope: Slope attributes, according to JSON structure
-            style: Style object containing the slope to create
-
-        Examples:
-
-            JSON pente section
-
-                {
-                    "algo": "H",
-                    "unit": "degree",
-                    "image_nodata": -99999,
-                    "slope_nodata": 91,
-                    "slope_max": 90
-                }
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        try:
-            self.algo = slope.get("algo", "H")
-            self.unit = slope.get("unit", "degree")
-            self.image_nodata = slope.get("image_nodata", -99999)
-            self.slope_nodata = slope.get("slope_nodata", 0)
-            self.slope_max = slope.get("slope_max", 90)
-        except KeyError as e:
-            raise MissingAttributeError(style.path, f"pente.{e}")
-
-
-
-class Style -(id: str) -
-
-

A raster data style

-

Attributes

-
-
path : str
-
Style origin path (JSON)
-
id : str
-
Style's technical identifier
-
identifier : str
-
Style's public identifier
-
title : str
-
Style's title
-
abstract : str
-
Style's abstract
-
keywords : List[str]
-
Style's keywords
-
legend : Legend
-
Style's legend
-
palette : Palette
-
Style's palette, optional
-
estompage : Estompage
-
Style's estompage parameters, optional
-
slope : Slope
-
Style's slope parameters, optional
-
exposition : Exposition
-
Style's exposition parameters, optional
-
-

Constructor method

-

Style's directory is defined with environment variable ROK4_STYLES_DIRECTORY. The provided id is used as the file/object name, with or without the JSON extension
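The lookup order described above can be sketched as follows. This is a hypothetical helper using plain `os.path.exists`; the real class also supports object storage through the storage module's own existence test:

```python
import os

def resolve_style_path(style_id: str, exists=os.path.exists) -> str:
    """Try the bare id under ROK4_STYLES_DIRECTORY, then with a ".json" extension."""
    directory = os.environ["ROK4_STYLES_DIRECTORY"]
    candidate = os.path.join(directory, style_id)
    if exists(candidate):
        return candidate
    candidate += ".json"
    if exists(candidate):
        return candidate
    raise FileNotFoundError(f"{candidate}, even without extension")
```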

-

Args

-
-
id
-
Style's id
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
FileNotFoundError
-
Style file or object does not exist, with or without extension
-
FormatError
-
Provided path is not a well formed JSON
-
MissingAttributeError
-
Attribute is missing in the content
-
Exception
-
No colour in the palette or invalid colour
-
-
- -Expand source code - -
class Style:
-    """A raster data style
-
-    Attributes:
-        path (str): Style origin path (JSON)
-        id (str): Style's technical identifier
-        identifier (str): Style's public identifier
-        title (str): Style's title
-        abstract (str): Style's abstract
-        keywords (List[str]): Style's keywords
-        legend (Legend): Style's legend
-
-        palette (Palette): Style's palette, optional
-        estompage (Estompage): Style's estompage parameters, optional
-        slope (Slope): Style's slope parameters, optional
-        exposition (Exposition): Style's exposition parameters, optional
-
-    """
-
-    def __init__(self, id: str) -> None:
-        """Constructor method
-
-        Style's directory is defined with environment variable ROK4_STYLES_DIRECTORY. The provided id is used as the file/object name, with or without the JSON extension
-
-        Args:
-            id: Style's id
-
-        Raises:
-            MissingEnvironmentError: Missing object storage information
-            StorageError: Storage read issue
-            FileNotFoundError: Style file or object does not exist, with or without extension
-            FormatError: Provided path is not a well formed JSON
-            MissingAttributeError: Attribute is missing in the content
-            Exception: No colour in the palette or invalid colour
-        """
-
-        self.id = id
-
-        try:
-            self.path = os.path.join(os.environ["ROK4_STYLES_DIRECTORY"], f"{self.id}")
-            if not exists(self.path):
-                self.path = os.path.join(os.environ["ROK4_STYLES_DIRECTORY"], f"{self.id}.json")
-                if not exists(self.path):
-                    raise FileNotFoundError(f"{self.path}, even without extension")
-        except KeyError as e:
-            raise MissingEnvironmentError(e)
-
-        try:
-            data = json.loads(get_data_str(self.path))
-
-            self.identifier = data["identifier"]
-            self.title = data["title"]
-            self.abstract = data["abstract"]
-            self.keywords = data["keywords"]
-
-            self.legend = Legend(data["legend"], self)
-
-            if "palette" in data:
-                self.palette = Palette(data["palette"], self)
-            else:
-                self.palette = None
-
-            if "estompage" in data:
-                self.estompage = Estompage(data["estompage"], self)
-            else:
-                self.estompage = None
-
-            if "pente" in data:
-                self.slope = Slope(data["pente"], self)
-            else:
-                self.slope = None
-
-            if "exposition" in data:
-                self.exposition = Exposition(data["exposition"], self)
-            else:
-                self.exposition = None
-
-        except JSONDecodeError as e:
-            raise FormatError("JSON", self.path, e)
-
-        except KeyError as e:
-            raise MissingAttributeError(self.path, e)
-
-    @property
-    def bands(self) -> int:
-        """Bands count after style application
-
-        Returns:
-            int: Bands count after style application, None if style is identity
-        """
-        if self.palette is not None:
-            if self.palette.no_alpha:
-                return 3
-            else:
-                return 4
-
-        elif self.estompage is not None or self.exposition is not None or self.slope is not None:
-            return 1
-
-        else:
-            return None
-
-    @property
-    def format(self) -> ColorFormat:
-        """Bands format after style application
-
-        Returns:
-            ColorFormat: Bands format after style application, None if style is identity
-        """
-        if self.palette is not None:
-            return ColorFormat.UINT8
-
-        elif self.estompage is not None or self.exposition is not None or self.slope is not None:
-            return ColorFormat.FLOAT32
-
-        else:
-            return None
-
-    @property
-    def input_nodata(self) -> float:
-        """Input nodata value
-
-        Returns:
-            float: Input nodata value, None if style is identity
-        """
-
-        if self.estompage is not None:
-            return self.estompage.image_nodata
-        elif self.exposition is not None:
-            return self.exposition.image_nodata
-        elif self.slope is not None:
-            return self.slope.image_nodata
-        elif self.palette is not None:
-            return self.palette.colours[0].value
-        else:
-            return None
-
-    @property
-    def is_identity(self) -> bool:
-        """Is style identity
-
-        Returns:
-            bool: Is style identity
-        """
-
-        return (
-            self.estompage is None
-            and self.exposition is None
-            and self.slope is None
-            and self.palette is None
-        )
-
-

Instance variables

-
-
prop bands : int
-
-

Bands count after style application

-

Returns

-
-
int
-
Bands count after style application, None if style is identity
-
-
- -Expand source code - -
@property
-def bands(self) -> int:
-    """Bands count after style application
-
-    Returns:
-        int: Bands count after style application, None if style is identity
-    """
-    if self.palette is not None:
-        if self.palette.no_alpha:
-            return 3
-        else:
-            return 4
-
-    elif self.estompage is not None or self.exposition is not None or self.slope is not None:
-        return 1
-
-    else:
-        return None
-
-
-
prop formatColorFormat
-
-

Bands format after style application

-

Returns

-
-
ColorFormat
-
Bands format after style application, None if style is identity
-
-
- -Expand source code - -
@property
-def format(self) -> ColorFormat:
-    """Bands format after style application
-
-    Returns:
-        ColorFormat: Bands format after style application, None if style is identity
-    """
-    if self.palette is not None:
-        return ColorFormat.UINT8
-
-    elif self.estompage is not None or self.exposition is not None or self.slope is not None:
-        return ColorFormat.FLOAT32
-
-    else:
-        return None
-
-
-
prop input_nodata : float
-
-

Input nodata value

-

Returns

-
-
float
-
Input nodata value, None if style is identity
-
-
- -Expand source code - -
@property
-def input_nodata(self) -> float:
-    """Input nodata value
-
-    Returns:
-        float: Input nodata value, None if style is identity
-    """
-
-    if self.estompage is not None:
-        return self.estompage.image_nodata
-    elif self.exposition is not None:
-        return self.exposition.image_nodata
-    elif self.slope is not None:
-        return self.slope.image_nodata
-    elif self.palette is not None:
-        return self.palette.colours[0].value
-    else:
-        return None
-
-
-
prop is_identity : bool
-
-

Is style identity

-

Returns

-
-
bool
-
Is style identity
-
-
- -Expand source code - -
@property
-def is_identity(self) -> bool:
-    """Is style identity
-
-    Returns:
-        bool: Is style identity
-    """
-
-    return (
-        self.estompage is None
-        and self.exposition is None
-        and self.slope is None
-        and self.palette is None
-    )
-
-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/tile_matrix_set.html b/2.2.2/rok4/tile_matrix_set.html deleted file mode 100644 index b88d750..0000000 --- a/2.2.2/rok4/tile_matrix_set.html +++ /dev/null @@ -1,565 +0,0 @@ - - - - - - -rok4.tile_matrix_set API documentation - - - - - - - - - - - -
-
-
-

Module rok4.tile_matrix_set

-
-
-

Provide classes to use a tile matrix set.

-

The module contains the following classes:

- -

Loading a tile matrix set requires environment variables:

-
    -
  • ROK4_TMS_DIRECTORY
  • -
-
-
-
-
-
-
-
-
-

Classes

-
-
-class TileMatrix -(level: Dict[~KT, ~VT], tms: TileMatrixSet) -
-
-

A tile matrix is a tile matrix set's level.

-

Attributes

-
-
id : str
-
TM identifier (no underscore).
-
tms : TileMatrixSet
-
TMS to which it belongs
-
resolution : float
-
Ground size of a pixel, using the unit of the TMS's coordinate system.
-
origin : Tuple[float, float]
-
X,Y coordinates of the upper left corner for the level, the grid's origin.
-
tile_size : Tuple[int, int]
-
Pixel width and height of a tile.
-
matrix_size : Tuple[int, int]
-
Number of tiles in the level, widthwise and heightwise.
-
-

Constructor method

-

Args

-
-
level
-
Level attributes, according to JSON structure
-
tms
-
TMS object containing the level to create
-
-

Raises

-
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class TileMatrix:
-    """A tile matrix is a tile matrix set's level.
-
-    Attributes:
-        id (str): TM identifier (no underscore).
-        tms (TileMatrixSet): TMS to which it belongs
-        resolution (float): Ground size of a pixel, using the unit of the TMS's coordinate system.
-        origin (Tuple[float, float]): X,Y coordinates of the upper left corner for the level, the grid's origin.
-        tile_size (Tuple[int, int]): Pixel width and height of a tile.
-        matrix_size (Tuple[int, int]): Number of tiles in the level, widthwise and heightwise.
-    """
-
-    def __init__(self, level: Dict, tms: "TileMatrixSet") -> None:
-        """Constructor method
-
-        Args:
-            level: Level attributes, according to JSON structure
-            tms: TMS object containing the level to create
-
-        Raises:
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        self.tms = tms
-        try:
-            self.id = level["id"]
-            if self.id.find("_") != -1:
-                raise Exception(
-                    f"TMS {tms.path} owns a level whose id contains an underscore ({self.id})"
-                )
-            self.resolution = level["cellSize"]
-            self.origin = (
-                level["pointOfOrigin"][0],
-                level["pointOfOrigin"][1],
-            )
-            self.tile_size = (
-                level["tileWidth"],
-                level["tileHeight"],
-            )
-            self.matrix_size = (
-                level["matrixWidth"],
-                level["matrixHeight"],
-            )
-            self.__latlon = (
-                self.tms.sr.EPSGTreatsAsLatLong() or self.tms.sr.EPSGTreatsAsNorthingEasting()
-            )
-        except KeyError as e:
-            raise MissingAttributeError(tms.path, f"tileMatrices[].{e}")
-
-    def x_to_column(self, x: float) -> int:
-        """Convert west-east coordinate to tile's column
-
-        Args:
-            x (float): west-east coordinate (TMS coordinates system)
-
-        Returns:
-            int: tile's column
-        """
-        return int((x - self.origin[0]) / (self.resolution * self.tile_size[0]))
-
-    def y_to_row(self, y: float) -> int:
-        """Convert north-south coordinate to tile's row
-
-        Args:
-            y (float): north-south coordinate (TMS coordinates system)
-
-        Returns:
-            int: tile's row
-        """
-        return int((self.origin[1] - y) / (self.resolution * self.tile_size[1]))
-
-    def tile_to_bbox(self, tile_col: int, tile_row: int) -> Tuple[float, float, float, float]:
-        """Get tile terrain extent (xmin, ymin, xmax, ymax), in TMS coordinates system
-
-        The case where the TMS spatial reference is Lat / Lon is handled.
-
-        Args:
-            tile_col (int): column index
-            tile_row (int): row index
-
-        Returns:
-            Tuple[float, float, float, float]: terrain extent (xmin, ymin, xmax, ymax)
-        """
-        if self.__latlon:
-            return (
-                self.origin[1] - self.resolution * (tile_row + 1) * self.tile_size[1],
-                self.origin[0] + self.resolution * tile_col * self.tile_size[0],
-                self.origin[1] - self.resolution * tile_row * self.tile_size[1],
-                self.origin[0] + self.resolution * (tile_col + 1) * self.tile_size[0],
-            )
-        else:
-            return (
-                self.origin[0] + self.resolution * tile_col * self.tile_size[0],
-                self.origin[1] - self.resolution * (tile_row + 1) * self.tile_size[1],
-                self.origin[0] + self.resolution * (tile_col + 1) * self.tile_size[0],
-                self.origin[1] - self.resolution * tile_row * self.tile_size[1],
-            )
-
-    def bbox_to_tiles(self, bbox: Tuple[float, float, float, float]) -> Tuple[int, int, int, int]:
-        """Get extreme tile columns and rows corresponding to the provided bounding box
-
-        The case where the TMS spatial reference is Lat / Lon is handled.
-
-        Args:
-            bbox (Tuple[float, float, float, float]): bounding box (xmin, ymin, xmax, ymax), in TMS coordinates system
-
-        Returns:
-            Tuple[int, int, int, int]: extreme tiles (col_min, row_min, col_max, row_max)
-        """
-
-        if self.__latlon:
-            return (
-                self.x_to_column(bbox[1]),
-                self.y_to_row(bbox[2]),
-                self.x_to_column(bbox[3]),
-                self.y_to_row(bbox[0]),
-            )
-        else:
-            return (
-                self.x_to_column(bbox[0]),
-                self.y_to_row(bbox[3]),
-                self.x_to_column(bbox[2]),
-                self.y_to_row(bbox[1]),
-            )
-
-    def point_to_indices(self, x: float, y: float) -> Tuple[int, int, int, int]:
-        """Get pyramid's tile and pixel indices from point's coordinates
-
-        TMS spatial reference with Lat / Lon order is handled.
-
-        Args:
-            x (float): point's x
-            y (float): point's y
-
-        Returns:
-            Tuple[int, int, int, int]: tile's column, tile's row, pixel's (in the tile) column, pixel's row
-        """
-
-        if self.__latlon:
-            absolute_pixel_column = int((y - self.origin[0]) / self.resolution)
-            absolute_pixel_row = int((self.origin[1] - x) / self.resolution)
-        else:
-            absolute_pixel_column = int((x - self.origin[0]) / self.resolution)
-            absolute_pixel_row = int((self.origin[1] - y) / self.resolution)
-
-        return (
-            absolute_pixel_column // self.tile_size[0],
-            absolute_pixel_row // self.tile_size[1],
-            absolute_pixel_column % self.tile_size[0],
-            absolute_pixel_row % self.tile_size[1],
-        )
-
-    @property
-    def tile_width(self) -> int:
-        return self.tile_size[0]
-
-    @property
-    def tile_heigth(self) -> int:
-        return self.tile_size[1]
-
-
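The column/row arithmetic in `x_to_column` and `y_to_row` above can be checked with a minimal standalone sketch (plain Python, no GDAL; the origin, resolution and tile size below are illustrative, not a real TMS level):

```python
# Sketch of the x_to_column / y_to_row arithmetic, assuming a level
# with origin (0, 0), resolution 1 and 256x256-pixel tiles.
def x_to_column(x, origin_x=0.0, resolution=1.0, tile_width=256):
    # Column index of the tile containing abscissa x.
    return int((x - origin_x) / (resolution * tile_width))

def y_to_row(y, origin_y=0.0, resolution=1.0, tile_height=256):
    # Row index grows downwards from the top-left origin.
    return int((origin_y - y) / (resolution * tile_height))
```

With these illustrative parameters, `x_to_column(300)` falls in column 1 and `y_to_row(-300)` in row 1, since each tile covers 256 ground units.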

Instance variables

-
-
prop tile_heigth : int
-
-
-
- -Expand source code - -
@property
-def tile_heigth(self) -> int:
-    return self.tile_size[1]
-
-
-
prop tile_width : int
-
-
-
- -Expand source code - -
@property
-def tile_width(self) -> int:
-    return self.tile_size[0]
-
-
-
-

Methods

-
-
-def bbox_to_tiles(self, bbox: Tuple[float, float, float, float]) ‑> Tuple[int, int, int, int] -
-
-

Get extreme tile columns and rows corresponding to the provided bounding box

-

The case where the TMS spatial reference is Lat / Lon is handled.

-

Args

-
-
bbox : Tuple[float, float, float, float]
-
bounding box (xmin, ymin, xmax, ymax), in TMS coordinates system
-
-

Returns

-
-
Tuple[int, int, int, int]
-
extreme tiles (col_min, row_min, col_max, row_max)
-
-
-
-def point_to_indices(self, x: float, y: float) ‑> Tuple[int, int, int, int] -
-
-

Get pyramid's tile and pixel indices from point's coordinates

-

TMS spatial reference with Lat / Lon order is handled.

-

Args

-
-
x : float
-
point's x
-
y : float
-
point's y
-
-

Returns

-
-
Tuple[int, int, int, int]
-
tile's column, tile's row, pixel's (in the tile) column, pixel's row
-
-
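The `point_to_indices` decomposition can be sketched standalone: an absolute pixel position is split into a tile index and a pixel index inside that tile (plain Python, assuming an illustrative level with origin (0, 0), resolution 1 and 256x256 tiles):

```python
# Sketch of point_to_indices in the X/Y axis-order case: floor-divide
# the absolute pixel position by the tile size to get the tile index,
# and take the remainder to get the pixel index inside that tile.
def point_to_indices(x, y, origin=(0.0, 0.0), resolution=1.0, tile_size=(256, 256)):
    col_px = int((x - origin[0]) / resolution)   # absolute pixel column
    row_px = int((origin[1] - y) / resolution)   # absolute pixel row
    return (
        col_px // tile_size[0],  # tile column
        row_px // tile_size[1],  # tile row
        col_px % tile_size[0],   # pixel column inside the tile
        row_px % tile_size[1],   # pixel row inside the tile
    )
```

For example, point (300, -10) is absolute pixel (300, 10), i.e. pixel (44, 10) of tile (1, 0).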
-
-def tile_to_bbox(self, tile_col: int, tile_row: int) ‑> Tuple[float, float, float, float] -
-
-

Get tile terrain extent (xmin, ymin, xmax, ymax), in TMS coordinates system

-

The case where the TMS spatial reference is Lat / Lon is handled.

-

Args

-
-
tile_col : int
-
column index
-
tile_row : int
-
row index
-
-

Returns

-
-
Tuple[float, float, float, float]
-
terrain extent (xmin, ymin, xmax, ymax)
-
-
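The `tile_to_bbox` extent computation can be sketched standalone for the X/Y axis-order case (plain Python; the origin, resolution and tile size are illustrative):

```python
# Sketch of tile_to_bbox (X/Y axis order): the terrain extent of tile
# (col, row), for a level with origin (0, 0), resolution 1 and
# 256x256-pixel tiles. Rows grow downwards from the top-left origin.
def tile_to_bbox(col, row, origin=(0.0, 0.0), resolution=1.0, tile_size=(256, 256)):
    return (
        origin[0] + resolution * col * tile_size[0],        # xmin
        origin[1] - resolution * (row + 1) * tile_size[1],  # ymin
        origin[0] + resolution * (col + 1) * tile_size[0],  # xmax
        origin[1] - resolution * row * tile_size[1],        # ymax
    )
```

Tile (1, 1) thus covers (256, -512, 512, -256) with these parameters: one tile width to the right of the origin, one tile height below it.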
-
-def x_to_column(self, x: float) ‑> int -
-
-

Convert west-east coordinate to tile's column

-

Args

-
-
x : float
-
west-east coordinate (TMS coordinates system)
-
-

Returns

-
-
int
-
tile's column
-
-
-
-def y_to_row(self, y: float) ‑> int -
-
-

Convert north-south coordinate to tile's row

-

Args

-
-
y : float
-
north-south coordinate (TMS coordinates system)
-
-

Returns

-
-
int
-
tile's row
-
-
-
-
-
-class TileMatrixSet -(name: str) -
-
-

A tile matrix set is a multi-level grid definition

-

Attributes

-
-
name : str
-
TMS's name
-
path : str
-
TMS origin path (JSON)
-
id : str
-
TMS identifier
-
srs : str
-
TMS coordinates system
-
sr : osgeo.osr.SpatialReference
-
TMS OSR spatial reference
-
levels : Dict[str, TileMatrix]
-
TMS levels
-
-

Constructor method

-

Args

-
-
name
-
TMS's name
-
-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
Exception
-
No level in the TMS, CRS not recognized by OSR
-
StorageError
-
Storage read issue
-
FileNotFoundError
-
TMS file or object does not exist
-
FormatError
-
Provided path is not well-formed JSON
-
MissingAttributeError
-
Attribute is missing in the content
-
-
- -Expand source code - -
class TileMatrixSet:
-    """A tile matrix set is a multi-level grid definition
-
-    Attributes:
-        name (str): TMS's name
-        path (str): TMS origin path (JSON)
-        id (str): TMS identifier
-        srs (str): TMS coordinates system
-        sr (osgeo.osr.SpatialReference): TMS OSR spatial reference
-        levels (Dict[str, TileMatrix]): TMS levels
-    """
-
-    def __init__(self, name: str) -> None:
-        """Constructor method
-
-        Args:
-            name: TMS's name
-
-        Raises:
-            MissingEnvironmentError: Missing object storage information
-            Exception: No level in the TMS, CRS not recognized by OSR
-            StorageError: Storage read issue
-            FileNotFoundError: TMS file or object does not exist
-            FormatError: Provided path is not well-formed JSON
-            MissingAttributeError: Attribute is missing in the content
-        """
-
-        self.name = name
-
-        try:
-            self.path = os.path.join(os.environ["ROK4_TMS_DIRECTORY"], f"{self.name}.json")
-        except KeyError as e:
-            raise MissingEnvironmentError(e)
-
-        try:
-            data = json.loads(get_data_str(self.path))
-
-            self.id = data["id"]
-            self.srs = data["crs"]
-            self.sr = srs_to_spatialreference(self.srs)
-            self.levels = {}
-            for level in data["tileMatrices"]:
-                lev = TileMatrix(level, self)
-                self.levels[lev.id] = lev
-
-            if len(self.levels.keys()) == 0:
-                raise Exception(f"TMS '{self.path}' has no level")
-
-            if data["orderedAxes"] != ["X", "Y"] and data["orderedAxes"] != ["Lon", "Lat"]:
-                raise Exception(
-                    f"TMS '{self.path}' has an invalid axes order: only X/Y or Lon/Lat are handled"
-                )
-
-        except JSONDecodeError as e:
-            raise FormatError("JSON", self.path, e)
-
-        except KeyError as e:
-            raise MissingAttributeError(self.path, e)
-
-        except RuntimeError as e:
-            raise Exception(
-                f"Wrong attribute 'crs' ('{self.srs}') in '{self.path}', not recognized by OSR. Trace: {e}"
-            )
-
-    def get_level(self, level_id: str) -> "TileMatrix":
-        """Get one level according to its identifier
-
-        Args:
-            level_id: Level identifier
-
-        Returns:
-            The corresponding tile matrix, or None if not present
-        """
-
-        return self.levels.get(level_id, None)
-
-    @property
-    def sorted_levels(self) -> List[TileMatrix]:
-        return sorted(self.levels.values(), key=lambda level: level.resolution)
-
-

Instance variables

-
-
prop sorted_levels : List[TileMatrix]
-
-
-
- -Expand source code - -
@property
-def sorted_levels(self) -> List[TileMatrix]:
-    return sorted(self.levels.values(), key=lambda level: level.resolution)
-
-
-
-

Methods

-
-
-def get_level(self, level_id: str) ‑> TileMatrix -
-
-

Get one level according to its identifier

-

Args

-
-
level_id
-
Level identifier
-
-

Returns

-

The corresponding tile matrix, or None if not present

-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/utils.html b/2.2.2/rok4/utils.html deleted file mode 100644 index 0abf36c..0000000 --- a/2.2.2/rok4/utils.html +++ /dev/null @@ -1,198 +0,0 @@ - - - - - - -rok4.utils API documentation - - - - - - - - - - - -
-
-
-

Module rok4.utils

-
-
-

Provide functions to manipulate OGR / OSR entities

-
-
-
-
-
-
-

Functions

-
-
-def bbox_to_geometry(bbox: Tuple[float, float, float, float], densification: int = 0) -
-
-

Convert bbox coordinates to OGR geometry

-

Args

-
-
bbox : Tuple[float, float, float, float]
-
bounding box (xmin, ymin, xmax, ymax)
-
densification : int, optional
-
Number of points to add on each side of the bounding box. Defaults to 0.
-
-

Raises

-
-
RuntimeError
-
Provided SRS is invalid for OSR
-
-

Returns

-
-
osgeo.ogr.Geometry
-
Corresponding OGR geometry, with spatial reference if provided
-
-
-
-def compute_bbox(source_dataset: osgeo.gdal.Dataset) ‑> Tuple[] -
-
-

Image bounding box computation method

-

Args

-
-
source_dataset : gdal.Dataset
-
Dataset instantiated from the raster image
-
-

Limitations

-

Image's axes must be parallel to the SRS' axes

-

Raises

-
-
AttributeError
-
source_dataset is not a gdal.Dataset instance.
-
Exception
-
The dataset does not contain transform data.
-
-
-
-def compute_format(dataset: osgeo.gdal.Dataset, path: str = None) ‑> ColorFormat -
-
-

Image color format computing method

-

Args

-
-
dataset : gdal.Dataset
-
Dataset instantiated from the image
-
path : str, optional
-
path to the original file/object
-
-

Raises

-
-
AttributeError
-
dataset is not a gdal.Dataset instance.
-
Exception
-
No color band found or unsupported color format.
-
-
-
-def reproject_bbox(bbox: Tuple[float, float, float, float], srs_src: str, srs_dst: str, densification: int = 5) ‑> Tuple[float, float, float, float] -
-
-

Return the bounding box in another coordinates system

-

Points are added to ensure the output bounding box contains the input bounding box

-

Args

-
-
bbox : Tuple[float, float, float, float]
-
bounding box (xmin, ymin, xmax, ymax) with source coordinates system
-
srs_src : str
-
source coordinates system
-
srs_dst : str
-
destination coordinates system
-
densification : int, optional
-
Number of points to add on each side of the bounding box. Defaults to 5.
-
-

Returns

-
-
Tuple[float, float, float, float]
-
bounding box (xmin, ymin, xmax, ymax) with destination coordinates system
-
-
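Why `reproject_bbox` densifies can be shown with a small standalone sketch: sample points along each side of the rectangle, transform them all, and take the min/max, so that curved reprojected edges are still bounded. Here `transform` is a stand-in for the real OSR coordinate transformation (the identity, for illustration):

```python
# Sketch of bbox densification: sample (densification + 1) segments,
# i.e. (densification + 2) points, along each of the four sides.
def densified_bbox_points(bbox, densification=5):
    xmin, ymin, xmax, ymax = bbox
    steps = densification + 1
    pts = []
    for i in range(steps + 1):
        t = i / steps
        pts.append((xmin + t * (xmax - xmin), ymin))  # south side
        pts.append((xmin + t * (xmax - xmin), ymax))  # north side
        pts.append((xmin, ymin + t * (ymax - ymin)))  # west side
        pts.append((xmax, ymin + t * (ymax - ymin)))  # east side
    return pts

def reproject_bbox(bbox, transform, densification=5):
    # Transform every sampled point, then bound them all.
    pts = [transform(p) for p in densified_bbox_points(bbox, densification)]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs), min(ys), max(xs), max(ys))
```

With the identity transform the bbox is returned unchanged; with a real projection, the densified sides keep the output rectangle from clipping the reprojected shape.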
-
-def reproject_point(point: Tuple[float, float], sr_src: osgeo.osr.SpatialReference, sr_dst: osgeo.osr.SpatialReference) -
-
-

Reproject a point

-

Args

-
-
point : Tuple[float, float]
-
source spatial reference point
-
sr_src : osgeo.osr.SpatialReference
-
source spatial reference
-
sr_dst : osgeo.osr.SpatialReference
-
destination spatial reference
-
-

Returns

-
-
Tuple[float, float]
-
X/Y in destination spatial reference
-
-
-
-def srs_to_spatialreference(srs: str) -
-
-

Convert a coordinates system string to an OSR spatial reference

-

Uses a cache to instantiate a SpatialReference from a given string only once.

-

Args

-
-
srs : str
-
PROJ4-compliant coordinates system, with authority and code, like EPSG:3857 or IGNF:LAMB93
-
-

Raises

-
-
RuntimeError
-
Provided SRS is invalid for OSR
-
-

Returns

-
-
osgeo.osr.SpatialReference
-
Corresponding OSR spatial reference
-
-
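The caching behaviour described above can be sketched without `osgeo`: an `lru_cache`-decorated factory builds each spatial reference only once per SRS string. The dict below is a stand-in for `osr.SpatialReference` (illustrative only, not the real implementation):

```python
from functools import lru_cache

# Records each "expensive" instantiation, to show the cache working.
instantiations = []

@lru_cache(maxsize=None)
def srs_to_spatialreference(srs):
    # The real function would build an osr.SpatialReference here.
    instantiations.append(srs)
    return {"srs": srs.upper()}  # stand-in for the OSR object
```

Calling it twice with `"EPSG:3857"` returns the very same object and instantiates only once; the real module applies the same pattern to avoid repeated OSR setup.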
-
-
-
-
-
- -
- - - diff --git a/2.2.2/rok4/vector.html b/2.2.2/rok4/vector.html deleted file mode 100644 index 63a4793..0000000 --- a/2.2.2/rok4/vector.html +++ /dev/null @@ -1,355 +0,0 @@ - - - - - - -rok4.vector API documentation - - - - - - - - - - - -
-
-
-

Module rok4.vector

-
-
-

Provide a class to read information on vector data from a file path or object path

-

The module contains the following class :

- -
-
-
-
-
-
-
-
-

Classes

-
-
-class Vector -
-
-

A data vector

-

Attributes

-
-
path : str
-
path to the file/object
-
bbox : Tuple[float, float, float, float]
-
bounding rectangle in the data projection
-
-

layers (List[Tuple[str, int, List[Tuple[str, str]]]]) : Vector layers with their name, their number of objects and their attributes

-
- -Expand source code - -
class Vector:
-    """A data vector
-
-    Attributes:
-        path (str): path to the file/object
-        bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
-        layers (List[Tuple[str, int, List[Tuple[str, str]]]]) : Vector layers with their name, their number of objects and their attributes
-    """
-
-    @classmethod
-    def from_file(cls, path: str, **kwargs) -> "Vector":
-        """Constructor method of a Vector from a file (Shapefile, GeoPackage, CSV or GeoJSON)
-
-        Args:
-            path (str): path to the file/object
-            **csv (Dict[str, str]) : dictionary of CSV parameters :
-                -srs (str) ("EPSG:2154" if not provided) : spatial reference system of the geometry
-                -column_x (str) ("x" if not provided) : field of the x coordinate
-                -column_y (str) ("y" if not provided) : field of the y coordinate
-                -column_wkt (str) (None if not provided) : field of the WKT of the geometry if WKT use to define coordinate
-
-        Examples:
-
-            from rok4.vector import Vector
-
-            try:
-                vector = Vector.from_file("file://tests/fixtures/ARRONDISSEMENT.shp")
-                vector_csv1 = Vector.from_file("file://tests/fixtures/vector.csv" , csv={"delimiter":";", "column_x":"x", "column_y":"y"})
-                vector_csv2 = Vector.from_file("file://tests/fixtures/vector2.csv" , csv={"delimiter":";", "column_wkt":"WKT"})
-
-            except Exception as e:
-                print(f"Vector creation raises an exception: {e}")
-
-        Raises:
-            MissingEnvironmentError: Missing object storage informations
-            StorageError: Storage read issue
-            Exception: Wrong column
-            Exception: Wrong data in column
-            Exception: Wrong format of file
-            Exception: Wrong data in the file
-
-        """
-
-        self = cls()
-
-        self.path = path
-
-        path_split = path.split("/")
-
-        if path_split[0] == "ceph:" or path.endswith(".csv"):
-            if path.endswith(".shp"):
-                with tempfile.TemporaryDirectory() as tmp:
-                    tmp_path = tmp + "/" + path_split[-1][:-4]
-
-                    copy(path, "file://" + tmp_path + ".shp")
-                    copy(path[:-4] + ".shx", "file://" + tmp_path + ".shx")
-                    copy(path[:-4] + ".cpg", "file://" + tmp_path + ".cpg")
-                    copy(path[:-4] + ".dbf", "file://" + tmp_path + ".dbf")
-                    copy(path[:-4] + ".prj", "file://" + tmp_path + ".prj")
-
-                    dataSource = ogr.Open(tmp_path + ".shp", 0)
-
-            elif path.endswith(".gpkg"):
-                with tempfile.TemporaryDirectory() as tmp:
-                    tmp_path = tmp + "/" + path_split[-1][:-5]
-
-                    copy(path, "file://" + tmp_path + ".gpkg")
-
-                    dataSource = ogr.Open(tmp_path + ".gpkg", 0)
-
-            elif path.endswith(".geojson"):
-                with tempfile.TemporaryDirectory() as tmp:
-                    tmp_path = tmp + "/" + path_split[-1][:-8]
-
-                    copy(path, "file://" + tmp_path + ".geojson")
-
-                    dataSource = ogr.Open(tmp_path + ".geojson", 0)
-
-            elif path.endswith(".csv"):
-                # Fetch the optional CSV parameters
-                if "csv" in kwargs:
-                    csv = kwargs["csv"]
-                else:
-                    csv = {}
-
-                if "srs" in csv and csv["srs"] is not None:
-                    srs = csv["srs"]
-                else:
-                    srs = "EPSG:2154"
-
-                if "column_x" in csv and csv["column_x"] is not None:
-                    column_x = csv["column_x"]
-                else:
-                    column_x = "x"
-
-                if "column_y" in csv and csv["column_y"] is not None:
-                    column_y = csv["column_y"]
-                else:
-                    column_y = "y"
-
-                if "column_wkt" in csv:
-                    column_wkt = csv["column_wkt"]
-                else:
-                    column_wkt = None
-
-                with tempfile.TemporaryDirectory() as tmp:
-                    tmp_path = tmp + "/" + path_split[-1][:-4]
-                    name_fich = path_split[-1][:-4]
-
-                    copy(path, "file://" + tmp_path + ".csv")
-
-                    with tempfile.NamedTemporaryFile(
-                        mode="w", suffix=".vrt", dir=tmp, delete=False
-                    ) as tmp2:
-                        vrt_file = "<OGRVRTDataSource>\n"
-                        vrt_file += '<OGRVRTLayer name="' + name_fich + '">\n'
-                        vrt_file += "<SrcDataSource>" + tmp_path + ".csv</SrcDataSource>\n"
-                        vrt_file += "<SrcLayer>" + name_fich + "</SrcLayer>\n"
-                        vrt_file += "<LayerSRS>" + srs + "</LayerSRS>\n"
-                        if column_wkt is None:
-                            vrt_file += (
-                                '<GeometryField encoding="PointFromColumns" x="'
-                                + column_x
-                                + '" y="'
-                                + column_y
-                                + '"/>\n'
-                            )
-                        else:
-                            vrt_file += (
-                                '<GeometryField encoding="WKT" field="' + column_wkt + '"/>\n'
-                            )
-                        vrt_file += "</OGRVRTLayer>\n"
-                        vrt_file += "</OGRVRTDataSource>"
-                        tmp2.write(vrt_file)
-                    dataSourceVRT = ogr.Open(tmp2.name, 0)
-                    os.remove(tmp2.name)
-                    dataSource = ogr.GetDriverByName("ESRI Shapefile").CopyDataSource(
-                        dataSourceVRT, tmp_path + ".shp"
-                    )
-
-            else:
-                raise Exception("This format of file cannot be loaded")
-
-        else:
-            dataSource = ogr.Open(get_osgeo_path(path), 0)
-
-        multipolygon = ogr.Geometry(ogr.wkbGeometryCollection)
-        try:
-            layer = dataSource.GetLayer()
-        except AttributeError:
-            raise Exception(f"The content of {self.path} cannot be read")
-
-        layers = []
-        for i in range(dataSource.GetLayerCount()):
-            layer = dataSource.GetLayer(i)
-            name = layer.GetName()
-            count = layer.GetFeatureCount()
-            layerDefinition = layer.GetLayerDefn()
-            attributes = []
-            for j in range(layerDefinition.GetFieldCount()):
-                fieldName = layerDefinition.GetFieldDefn(j).GetName()
-                fieldTypeCode = layerDefinition.GetFieldDefn(j).GetType()
-                fieldType = layerDefinition.GetFieldDefn(j).GetFieldTypeName(fieldTypeCode)
-                attributes += [(fieldName, fieldType)]
-            for feature in layer:
-                geom = feature.GetGeometryRef()
-                if geom is not None:
-                    multipolygon.AddGeometry(geom)
-            layers += [(name, count, attributes)]
-
-        self.layers = layers
-        self.bbox = multipolygon.GetEnvelope()
-
-        return self
-
-    @classmethod
-    def from_parameters(cls, path: str, bbox: tuple, layers: list) -> "Vector":
-        """Constructor method of a Vector from parameters
-
-        Args:
-            path (str): path to the file/object
-            bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
-            layers (List[Tuple[str, int, List[Tuple[str, str]]]]) : Vector layers with their name, their number of objects and their attributes
-
-        Examples:
-
-            try :
-                vector = Vector.from_parameters("file://tests/fixtures/ARRONDISSEMENT.shp", (1,2,3,4), [('ARRONDISSEMENT', 14, [('ID', 'String'), ('NOM', 'String'), ('INSEE_ARR', 'String'), ('INSEE_DEP', 'String'), ('INSEE_REG', 'String'), ('ID_AUT_ADM', 'String'), ('DATE_CREAT', 'String'), ('DATE_MAJ', 'String'), ('DATE_APP', 'Date'), ('DATE_CONF', 'Date')])])
-
-            except Exception as e:
-                print(f"Vector creation raises an exception: {e}")
-
-        """
-
-        self = cls()
-
-        self.path = path
-        self.bbox = bbox
-        self.layers = layers
-
-        return self
-
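The CSV branch above wraps the file in an OGR VRT; building that XML can be sketched standalone (the layer name and paths below are illustrative, and the element names mirror the code above):

```python
# Sketch of the OGR VRT wrapper built for CSV input: either a
# PointFromColumns geometry (x/y columns) or a WKT column. The
# EPSG:2154 default mirrors the from_file defaults.
def build_vrt(layer_name, csv_path, srs="EPSG:2154",
              column_x="x", column_y="y", column_wkt=None):
    if column_wkt is None:
        geometry = (
            f'<GeometryField encoding="PointFromColumns" '
            f'x="{column_x}" y="{column_y}"/>'
        )
    else:
        geometry = f'<GeometryField encoding="WKT" field="{column_wkt}"/>'
    return (
        "<OGRVRTDataSource>\n"
        f'<OGRVRTLayer name="{layer_name}">\n'
        f"<SrcDataSource>{csv_path}</SrcDataSource>\n"
        f"<SrcLayer>{layer_name}</SrcLayer>\n"
        f"<LayerSRS>{srs}</LayerSRS>\n"
        f"{geometry}\n"
        "</OGRVRTLayer>\n"
        "</OGRVRTDataSource>"
    )
```

`ogr.Open` on such a `.vrt` file then exposes the CSV rows as geometric features, which is what lets the rest of `from_file` treat all formats uniformly.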
-

Static methods

-
-
-def from_file(path: str, **kwargs) ‑> Vector -
-
-

Constructor method of a Vector from a file (Shapefile, GeoPackage, CSV or GeoJSON)

-

Args

-
-
path : str
-
path to the file/object
-
-

**csv (Dict[str, str]) : dictionary of CSV parameters :
  • srs (str) ("EPSG:2154" if not provided) : spatial reference system of the geometry
  • column_x (str) ("x" if not provided) : field of the x coordinate
  • column_y (str) ("y" if not provided) : field of the y coordinate
  • column_wkt (str) (None if not provided) : field containing the WKT of the geometry, if WKT is used to define coordinates

-

Examples

-

from rok4.vector import Vector

-

try:
    vector = Vector.from_file("file://tests/fixtures/ARRONDISSEMENT.shp")
    vector_csv1 = Vector.from_file("file://tests/fixtures/vector.csv", csv={"delimiter":";", "column_x":"x", "column_y":"y"})
    vector_csv2 = Vector.from_file("file://tests/fixtures/vector2.csv", csv={"delimiter":";", "column_wkt":"WKT"})
except Exception as e:
    print(f"Vector creation raises an exception: {e}")

-

Raises

-
-
MissingEnvironmentError
-
Missing object storage information
-
StorageError
-
Storage read issue
-
Exception
-
Wrong column
-
Exception
-
Wrong data in column
-
Exception
-
Wrong format of file
-
Exception
-
Wrong data in the file
-
-
-
-def from_parameters(path: str, bbox: tuple, layers: list) ‑> Vector -
-
-

Constructor method of a Vector from parameters

-

Args

-
-
path : str
-
path to the file/object
-
bbox : Tuple[float, float, float, float]
-
bounding rectangle in the data projection
-
-

layers (List[Tuple[str, int, List[Tuple[str, str]]]]) : Vector layers with their name, their number of objects and their attributes

-

Examples

-

try:
    vector = Vector.from_parameters("file://tests/fixtures/ARRONDISSEMENT.shp", (1,2,3,4), [('ARRONDISSEMENT', 14, [('ID', 'String'), ('NOM', 'String'), ('INSEE_ARR', 'String'), ('INSEE_DEP', 'String'), ('INSEE_REG', 'String'), ('ID_AUT_ADM', 'String'), ('DATE_CREAT', 'String'), ('DATE_MAJ', 'String'), ('DATE_APP', 'Date'), ('DATE_CONF', 'Date')])])
except Exception as e:
    print(f"Vector creation raises an exception: {e}")

-
-
-
-
-
-
- -
- - - diff --git a/2.2.2/search/search_index.json b/2.2.2/search/search_index.json deleted file mode 100644 index 61cb848..0000000 --- a/2.2.2/search/search_index.json +++ /dev/null @@ -1 +0,0 @@ -{"config":{"lang":["fr"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Librairies ROK4 Python","text":"

These libraries ease the manipulation of ROK4 project entities such as Tile Matrix Sets, pyramids or layers, as well as the manipulation of the associated storages.

"},{"location":"#installer-la-librairie","title":"Installer la librairie","text":"

Required system installations:

  • debian : apt install python3-rados python3-gdal

From PyPI: pip install rok4

From GitHub: pip install https://github.com/rok4/core-python/releases/download/2.2.2/rok4-2.2.2-py3-none-any.whl

The execution environment must have access to the system libraries. When working inside a Python virtual environment, make sure to create it with python3 -m venv --system-site-packages .venv.

"},{"location":"#utiliser-la-librairie","title":"Utiliser la librairie","text":"
from rok4.tile_matrix_set import TileMatrixSet\n\ntry:\n    tms = TileMatrixSet(\"file:///path/to/tms.json\")\nexcept Exception as exc:\n    print(exc)\n

The following environment variables may be required, per module:

  • storage : more details in the module's technical documentation
    • ROK4_READING_LRU_CACHE_SIZE : number of elements in the read cache (0 for no limit)
    • ROK4_READING_LRU_CACHE_TTL : validity duration of a cached element, in seconds (0 for no limit)
    • ROK4_CEPH_CONFFILE : Ceph cluster configuration file
    • ROK4_CEPH_USERNAME : Ceph cluster access account
    • ROK4_CEPH_CLUSTERNAME : Ceph cluster name
    • ROK4_S3_KEY : S3 server(s) key(s)
    • ROK4_S3_SECRETKEY : S3 server(s) secret key(s)
    • ROK4_S3_URL : S3 server(s) URL(s)
    • ROK4_SSL_NO_VERIFY : disable SSL verification for S3 access (any non-empty value)
  • tile_matrix_set :
    • ROK4_TMS_DIRECTORY : root directory (file or object) of the tile matrix sets
  • style :
    • ROK4_STYLES_DIRECTORY : root directory (file or object) of the styles

Reading uses an LRU cache system with a TTL. It can be configured with environment variables :
  • ROK4_READING_LRU_CACHE_SIZE : number of cached elements. Default 64. Set 0 or a negative integer to configure an unbounded cache. A power of two makes the cache more efficient.
  • ROK4_READING_LRU_CACHE_TTL : validity duration of a cached element, in seconds. Default 300. Set 0 or a negative integer to get a cache without expiration date.

To disable the cache (always read data from storage), set ROK4_READING_LRU_CACHE_SIZE to 1 and ROK4_READING_LRU_CACHE_TTL to 1.

Using CEPH storage requires environment variables :

Using S3 storage requires environment variables :

More examples in the developer documentation.

"},{"location":"#contribuer","title":"Contribuer","text":"
  • Install the development dependencies:

    python3 -m pip install -e .[dev]
    pre-commit install

  • See the contribution guidelines

"},{"location":"#compiler-la-librairie","title":"Compiler la librairie","text":"
apt install python3-venv python3-rados python3-gdal\npython3 -m venv .venv\nsource .venv/bin/activate\npython3 -m pip install --upgrade build bump2version\nbump2version --current-version 0.0.0 --new-version 2.2.2 patch\n\n# Run unit tests\npython3 -m pip install -e .[test]\n# To use system installed modules rados and osgeo\necho \"/usr/lib/python3/dist-packages/\" >.venv/lib/python3.10/site-packages/system.pth\npython3 -c 'import sys; print (sys.path)'\n# Run tests\ncoverage run -m pytest\n# Get tests report and generate site\ncoverage report -m\ncoverage html -d dist/tests/\n\n# Build documentation\npython3 -m pip install -e .[doc]\npdoc3 --html --output-dir dist/ rok4\n\n# Build artefacts\npython3 -m build\n

Note:

When installing the apt package python3-gdal, a dependency may ask for configuration interactions. To install in a non-interactive environment, setting the shell variable DEBIAN_FRONTEND=noninteractive applies a default configuration.

"},{"location":"#publier-la-librairie-sur-pypi","title":"Publier la librairie sur Pypi","text":"

Configure the $HOME/.pypirc file with your PyPI account credentials.

python3 -m pip install --upgrade twine\npython3 -m twine upload --repository pypi dist/rok4-2.2.2-py3-none-any.whl dist/rok4-2.2.2.tar.gz\n
"},{"location":"CHANGELOG/","title":"Historique des versions","text":""},{"location":"CHANGELOG/#222","title":"2.2.2","text":""},{"location":"CHANGELOG/#changed","title":"[Changed]","text":"
  • storage module : it can be used without the GDAL library; only the get_osgeo_path function for S3 will be unavailable
"},{"location":"CHANGELOG/#220","title":"2.2.0","text":""},{"location":"CHANGELOG/#added","title":"[Added]","text":"
  • Added the ROK4 style handling library
"},{"location":"CHANGELOG/#215","title":"2.1.5","text":""},{"location":"CHANGELOG/#changed_1","title":"[Changed]","text":"
  • Pyramid : the function loading the list into memory returns the number of slabs
"},{"location":"CHANGELOG/#214","title":"2.1.4","text":""},{"location":"CHANGELOG/#fixed","title":"[Fixed]","text":"
  • Storage : the response to a HEAD request (S3 existence test) returns a 404 code and not NoSuchKey (confusion with object reading)
  • RasterSet: loading a raster set from a file or a descriptor uses the Storage library and not the GDAL library
"},{"location":"CHANGELOG/#213","title":"2.1.3","text":""},{"location":"CHANGELOG/#fixed_1","title":"[Fixed]","text":"
  • Storage: when reading or testing the existence of a missing S3 object, the code in the response is not 404 but NoSuchKey
"},{"location":"CHANGELOG/#210","title":"2.1.0","text":""},{"location":"CHANGELOG/#added_1","title":"[Added]","text":"
  • Pyramid
    • Added functions retrieving the tile_limits and the number of channels of the pyramid
    • Added functions adding or removing levels in a pyramid
  • TileMatrixSet
    • Added functions retrieving the tile height and width of a TileMatrixSet
"},{"location":"CHANGELOG/#changed_2","title":"[Changed]","text":"
  • Pyramid
    • Added an optional "mask" parameter to the "from other" constructor, to keep or discard the masks of the pyramid the new one is based on
  • Documentation of the different versions is managed with the mike tool
"},{"location":"CHANGELOG/#201","title":"2.0.1","text":""},{"location":"CHANGELOG/#added_2","title":"[Added]","text":"
  • storage: the read cache is configurable in size (with ROK4_READING_LRU_CACHE_SIZE) and in retention time (with ROK4_READING_LRU_CACHE_TTL)
"},{"location":"CHANGELOG/#security","title":"[Security]","text":"
  • Upgraded pillow (security vulnerability related to libwebp)
"},{"location":"CHANGELOG/#200","title":"2.0.0","text":""},{"location":"CHANGELOG/#fixed_2","title":"[Fixed]","text":"
  • Pyramid
    • reading a tile from a 1-channel PNG pyramid now also returns a 3-dimensional numpy.array (the last dimension is a one-element array)
"},{"location":"CHANGELOG/#changed_3","title":"[Changed]","text":"
  • Storage
    • The S3 client keeps connections open
    • The get_data_binary function has an LRU-style cache, with a 5-minute validity period
"},{"location":"CHANGELOG/#171","title":"1.7.1","text":""},{"location":"CHANGELOG/#added_3","title":"[Added]","text":"
  • Raster

    • RasterSet class, representing a collection of Raster objects, with additional information
    • Methods importing and exporting the information extracted by a RasterSet instance, through a descriptor (JSON file or object, or even standard output)
    • Internal documentation
    • Unit tests for the RasterSet class
    • Raster class: constructor from parameters
  • Pyramid

    • Function computing the size of a pyramid
    • Generator reading the content list
  • Storage

    • Function computing the size of the files under a path, according to the storage type
    • Added copying from HTTP to FILE/S3/CEPH
    • Added functions reading an HTTP file, testing the existence of an HTTP file, and computing the size of an HTTP file
"},{"location":"CHANGELOG/#changed_4","title":"[Changed]","text":"
  • Raster
    • Code harmonization
    • PEP 8 compliance
  • test_Raster
    • Code harmonization
    • PEP 8 compliance
  • Utils
    • PEP 8 compliance of the compute_bbox and compute_format functions
"},{"location":"CHANGELOG/#fixed_3","title":"[Fixed]","text":"
  • Utils
    • Fixed a variable name in the compute_format function, which shadowed a Python builtin.
"},{"location":"CHANGELOG/#160","title":"1.6.0","text":"

Reading through a virtual file system with GDAL

"},{"location":"CHANGELOG/#added_4","title":"[Added]","text":"
  • Storage
    • get_osgeo_path function configuring the right virtual file system according to the provided path, and returning the path to use in GDAL's or OGR's Open
"},{"location":"CHANGELOG/#changed_5","title":"[Changed]","text":"
  • Storage
    • retrieving an S3 client (__get_s3_client) returns the client, the host, the access and secret keys, and the bucket name without any cluster host suffix
"},{"location":"CHANGELOG/#fixed_4","title":"[Fixed]","text":"
  • Storage
    • S3 binary read: wrong bucket and object name configuration, and wrong partial read
"},{"location":"CHANGELOG/#removed","title":"[Removed]","text":"
  • Exceptions
    • NotImplementedError is a native exception
"},{"location":"CHANGELOG/#150","title":"1.5.0","text":""},{"location":"CHANGELOG/#added_5","title":"[Added]","text":"
  • Level
    • is_in_limits tile test function: are the tile's indices within the level's limits?
  • Pyramid
    • Reading a tile first checks that the indices are within the level's limits
    • Exceptions raised while decoding a raster tile emit a FormatError
    • get_tile_indices accepts a coordinate system as input: it is the system of the provided coordinates, and allows a reprojection when it differs from the system of the pyramid's data
  • Utils
    • Better reprojection handling in reproject_bbox: identical input systems, or systems differing only in axis order, are detected to avoid the computation
    • Added the point reprojection function reproject_point: identical input systems, or systems differing only in axis order, are detected to avoid the computation
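The shortcut described for reproject_bbox and reproject_point can be sketched as follows. The CRS table below is a toy assumption for illustration — the real library introspects the coordinate systems rather than using a hardcoded map:

```python
# Toy model: each CRS maps to (base system, y_axis_first).
# Illustrative assumption, not real CRS introspection.
CRS_INFO = {
    "EPSG:4326": ("WGS84", True),        # latitude first
    "CRS:84": ("WGS84", False),          # longitude first
    "EPSG:3857": ("WebMercator", False),
}


def reproject_point_sketch(x, y, src, dst):
    """Reproject a point, skipping the computation when possible."""
    if src == dst:
        return (x, y)  # identical systems: nothing to compute
    src_base, src_y_first = CRS_INFO[src]
    dst_base, dst_y_first = CRS_INFO[dst]
    if src_base == dst_base:
        # Only the axis order differs: swap instead of reprojecting
        return (x, y) if src_y_first == dst_y_first else (y, x)
    raise NotImplementedError("actual reprojection is out of scope for this sketch")
```

Detecting these two cheap cases avoids round-tripping coordinates through a full transformation pipeline for no numeric gain.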
"},{"location":"CHANGELOG/#changed_6","title":"[Changed]","text":"
  • Utils:
    • bbox_to_geometry: no coordinate system is provided any more; the function simply builds the OGR geometry from the bbox, optionally densifying the edges with points
  • Pyramid:
    • Function renamed: update_limits -> set_limits_from_bbox, to be more explicit about its behaviour (the limits are overwritten, not merely updated by union with the provided bbox)
"},{"location":"CHANGELOG/#144","title":"1.4.4","text":"

Added pyramid data reading features, and adopted the PyPA recommendations for managing the project.

"},{"location":"CHANGELOG/#added_6","title":"[Added]","text":"
  • TileMatrix:
    • Function computing the tile indices, and the pixel indices within the tile, from a point in the TMS coordinate system
  • Pyramid:
    • Function computing the tile indices, and the pixel indices within the tile, from a point in the TMS coordinate system, optionally with a level
    • Tile reading functions: as source binary data, or as a 3-dimensional array for raster tiles
  • Storage:
    • Binary read function, full or partial, of a file or of an S3 or CEPH object
  • Exceptions: NotImplementedError indicates that a feature is not implemented for every case. Here, decompression of raster data is not handled for the packbit and LZW compressions

  • Added PyPI publication to the GitHub CI
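The tile-and-pixel index computation added in this release follows standard TMS arithmetic; a self-contained sketch (the origin, resolution, and tile size are example parameters, not values read from an actual TileMatrix):

```python
def tile_indices_sketch(x, y, origin_x, origin_y, resolution, tile_size=256):
    """Tile column/row, and pixel position inside the tile, for a point in
    the TMS coordinate system (Y axis pointing down from the top-left origin)."""
    col_px = int((x - origin_x) / resolution)   # absolute pixel column
    row_px = int((origin_y - y) / resolution)   # absolute pixel row
    tile_col, pixel_col = divmod(col_px, tile_size)
    tile_row, pixel_row = divmod(row_px, tile_size)
    return tile_col, tile_row, pixel_col, pixel_row
```

For example, with origin (0, 1000) and a resolution of 1 unit per pixel, the point (300, 400) is 300 pixels right of and 600 pixels below the origin, landing in tile (1, 2) at pixel (44, 88).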

"},{"location":"CHANGELOG/#changed_7","title":"[Changed]","text":"
  • Storage:
    • String reads rely on the full binary read. No change in usage.
  • TileMatrixSet: whatever the coordinate system, only one axis order is handled: X,Y (or Lon,Lat). However, the functions computing a bbox, or computing from one, respect the system's axis order in those bboxes.

  • Project configuration moved to the pyproject.toml file

"},{"location":"CHANGELOG/#130","title":"1.3.0","text":"

Added the vector data reading library with unit tests, and added storage features. Improved project management and continuous integration.

"},{"location":"CHANGELOG/#added_7","title":"[Added]","text":"
  • Vector data reading library:
    • Loading vector data from shapefile, GeoPackage, CSV and GeoJSON files
    • Unit tests written
  • Pyramid library: unit tests completed
  • Storage library: support for the CEPH -> S3 copy
  • Project management (builds, dependencies...) via poetry
  • Version injected into the pyproject.toml file and __init__.py (definition of the __version__ variable)
  • GitHub CI evolution:
    • Installations and unit tests checked on ubuntu 20.04 / python 3.8 and ubuntu 22.04 / python 3.10
    • Artifact published with the unit test results
    • Release cleaned up on error
    • Documentation built and published on the gh-pages branch
"},{"location":"CHANGELOG/#120","title":"1.2.0","text":"

Added the libraries for the make-layer.py utility

"},{"location":"CHANGELOG/#added_8","title":"[Added]","text":"
  • Storage library: unit tests completed

  • Pyramid library:

    • Added getters for the top and bottom levels

  • Added the Layer management library:

    • Loading a layer from parameters
    • Loading a layer from a descriptor
    • Writing the descriptor in the format expected by the server
    • Unit tests written

  • Added a Utils utility library:

    • Conversion of an SRS into an OSR SpatialReference object
    • Conversion of a bbox into an OGR Geometry object
    • Reprojection of a bbox with edge densification and partial reprojection
    • Unit tests written

  • Configured the coverage tool to report unit test coverage

"},{"location":"CHANGELOG/#110","title":"1.1.0","text":"

Support for several S3 storage clusters.

"},{"location":"CHANGELOG/#added_9","title":"[Added]","text":"
  • Storage abstraction library:
    • Support for several S3 clusters. The environment variables for S3 storage give several comma-separated values, and bucket names can be suffixed with "@{S3 cluster host}". By default, the first defined cluster is used. The cluster host is never written in the pyramid descriptor or in the list file (since they are stored on the cluster, we know which cluster holds the objects). Symbolic objects do not specify it either, and can only exist within a single S3 cluster
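The "bucket@{S3 cluster host}" convention, with the first configured cluster as the default, can be illustrated with a small parser. The environment-variable name used at the bottom is an assumption for the example, not necessarily the one the library reads:

```python
import os


def resolve_bucket(name, hosts):
    """Split an optional '@host' suffix off a bucket name;
    default to the first configured cluster host."""
    if "@" in name:
        bucket, host = name.split("@", 1)
        if host not in hosts:
            raise ValueError(f"unknown S3 cluster host: {host}")
        return bucket, host
    return name, hosts[0]


# Comma-separated cluster hosts, as described in the changelog entry
# (the variable name ROK4_S3_URL is an assumption for this example).
hosts = os.environ.get("ROK4_S3_URL", "s3-a.example.com,s3-b.example.com").split(",")
```

A plain bucket name resolves to the first cluster; a suffixed one is routed to the named cluster.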
"},{"location":"CHANGELOG/#100","title":"1.0.0","text":"

Initialization of the Python libraries used by the upcoming Python tools of the pytools repository.

"},{"location":"CHANGELOG/#added_10","title":"[Added]","text":"
  • Storage abstraction library (S3, CEPH or FILE):
    • retrieving content as a string
    • writing string content
    • creating a symbolic link
    • file/object <-> file/object copy
  • Tile Matrix Set loading library
  • Pyramid descriptor management library:
    • loading from a descriptor or by cloning (with a change of storage)
    • writing the descriptor
  • Unit tests covering these libraries
"},{"location":"CONTRIBUTING/","title":"Contribution guidelines","text":"

Thank you for considering contributing to this project!

"},{"location":"CONTRIBUTING/#git-hooks","title":"Git hooks","text":"

We use git hooks via pre-commit to automatically apply and check certain "rules". Please install it before pushing a commit.

See the corresponding configuration file: .pre-commit-config.yaml.

"},{"location":"CONTRIBUTING/#pull-request","title":"Pull request","text":"

The title of the PR is used to automatically build the release notes. In a comment on your PR, you can give details that the project maintainers will add to the CHANGELOG.md file.

The changelog format is the following, in markdown:

### [Added]

List of new features.

### [Changed]

List of changes to existing features.

### [Deprecated]

List of deprecated features.

### [Removed]

List of removed features.

### [Fixed]

List of functional fixes.

### [Security]

List of security fixes.

Empty sections, with nothing to list, can be omitted.

"},{"location":"documentation/","title":"Documentation technique","text":""},{"location":"unit-tests/","title":"Rapport des tests unitaires","text":""}]} \ No newline at end of file diff --git a/2.2.2/sitemap.xml b/2.2.2/sitemap.xml deleted file mode 100644 index 2fdef08..0000000 --- a/2.2.2/sitemap.xml +++ /dev/null @@ -1,23 +0,0 @@ - - - - https://rok4.github.io/core-python/2.2.2/ - 2024-10-01 - - - https://rok4.github.io/core-python/2.2.2/CHANGELOG/ - 2024-10-01 - - - https://rok4.github.io/core-python/2.2.2/CONTRIBUTING/ - 2024-10-01 - - - https://rok4.github.io/core-python/2.2.2/documentation/ - 2024-10-01 - - - https://rok4.github.io/core-python/2.2.2/unit-tests/ - 2024-10-01 - - \ No newline at end of file diff --git a/2.2.2/sitemap.xml.gz b/2.2.2/sitemap.xml.gz deleted file mode 100644 index 55d0f17..0000000 Binary files a/2.2.2/sitemap.xml.gz and /dev/null differ diff --git a/2.2.2/tests/class_index.html b/2.2.2/tests/class_index.html deleted file mode 100644 index dc2c366..0000000 --- a/2.2.2/tests/class_index.html +++ /dev/null @@ -1,371 +0,0 @@ - - - - - Coverage report - - - - - -
Coverage report: 84%
Files - Functions - Classes
coverage.py v7.6.1, created at 2024-10-01 15:08 +0000
File                        | class                   | statements | missing | excluded | coverage
src/rok4/enums.py           | PyramidType             | 0          | 0       | 0        | 100%
src/rok4/enums.py           | SlabType                | 0          | 0       | 0        | 100%
src/rok4/enums.py           | StorageType             | 0          | 0       | 0        | 100%
src/rok4/enums.py           | ColorFormat             | 0          | 0       | 0        | 100%
src/rok4/enums.py           | (no class)              | 17         | 0       | 0        | 100%
src/rok4/exceptions.py      | MissingAttributeError   | 3          | 0       | 0        | 100%
src/rok4/exceptions.py      | MissingEnvironmentError | 2          | 0       | 0        | 100%
src/rok4/exceptions.py      | StorageError            | 3          | 0       | 0        | 100%
src/rok4/exceptions.py      | FormatError             | 4          | 0       | 0        | 100%
src/rok4/exceptions.py      | (no class)              | 8          | 0       | 0        | 100%
src/rok4/layer.py           | Layer                   | 105        | 20      | 0        | 81%
src/rok4/layer.py           | (no class)              | 27         | 0       | 0        | 100%
src/rok4/pyramid.py         | Level                   | 60         | 23      | 0        | 62%
src/rok4/pyramid.py         | Pyramid                 | 301        | 102     | 0        | 66%
src/rok4/pyramid.py         | (no class)              | 136        | 0       | 0        | 100%
src/rok4/raster.py          | Raster                  | 35         | 0       | 0        | 100%
src/rok4/raster.py          | RasterSet               | 64         | 4       | 0        | 94%
src/rok4/raster.py          | (no class)              | 27         | 0       | 0        | 100%
src/rok4/storage.py         | (no class)              | 538        | 107     | 0        | 80%
src/rok4/style.py           | Colour                  | 20         | 5       | 0        | 75%
src/rok4/style.py           | Palette                 | 36         | 3       | 0        | 92%
src/rok4/style.py           | Slope                   | 8          | 2       | 0        | 75%
src/rok4/style.py           | Exposition              | 7          | 2       | 0        | 71%
src/rok4/style.py           | Estompage               | 8          | 2       | 0        | 75%
src/rok4/style.py           | Legend                  | 9          | 2       | 0        | 78%
src/rok4/style.py           | Style                   | 54         | 1       | 0        | 98%
src/rok4/style.py           | (no class)              | 35         | 0       | 0        | 100%
src/rok4/tile_matrix_set.py | TileMatrix              | 28         | 2       | 0        | 93%
src/rok4/tile_matrix_set.py | TileMatrixSet           | 26         | 1       | 0        | 96%
src/rok4/tile_matrix_set.py | (no class)              | 23         | 0       | 0        | 100%
src/rok4/utils.py           | (no class)              | 103        | 3       | 0        | 97%
src/rok4/vector.py          | Vector                  | 90         | 12      | 0        | 87%
src/rok4/vector.py          | (no class)              | 10         | 0       | 0        | 100%
Total                       |                         | 1787       | 291     | 0        | 84%
- - - diff --git a/2.2.2/tests/coverage_html_cb_6fb7b396.js b/2.2.2/tests/coverage_html_cb_6fb7b396.js deleted file mode 100644 index 1face13..0000000 --- a/2.2.2/tests/coverage_html_cb_6fb7b396.js +++ /dev/null @@ -1,733 +0,0 @@ -// Licensed under the Apache License: http://www.apache.org/licenses/LICENSE-2.0 -// For details: https://github.com/nedbat/coveragepy/blob/master/NOTICE.txt - -// Coverage.py HTML report browser code. -/*jslint browser: true, sloppy: true, vars: true, plusplus: true, maxerr: 50, indent: 4 */ -/*global coverage: true, document, window, $ */ - -coverage = {}; - -// General helpers -function debounce(callback, wait) { - let timeoutId = null; - return function(...args) { - clearTimeout(timeoutId); - timeoutId = setTimeout(() => { - callback.apply(this, args); - }, wait); - }; -}; - -function checkVisible(element) { - const rect = element.getBoundingClientRect(); - const viewBottom = Math.max(document.documentElement.clientHeight, window.innerHeight); - const viewTop = 30; - return !(rect.bottom < viewTop || rect.top >= viewBottom); -} - -function on_click(sel, fn) { - const elt = document.querySelector(sel); - if (elt) { - elt.addEventListener("click", fn); - } -} - -// Helpers for table sorting -function getCellValue(row, column = 0) { - const cell = row.cells[column] // nosemgrep: eslint.detect-object-injection - if (cell.childElementCount == 1) { - var child = cell.firstElementChild; - if (child.tagName === "A") { - child = child.firstElementChild; - } - if (child instanceof HTMLDataElement && child.value) { - return child.value; - } - } - return cell.innerText || cell.textContent; -} - -function rowComparator(rowA, rowB, column = 0) { - let valueA = getCellValue(rowA, column); - let valueB = getCellValue(rowB, column); - if (!isNaN(valueA) && !isNaN(valueB)) { - return valueA - valueB; - } - return valueA.localeCompare(valueB, undefined, {numeric: true}); -} - -function sortColumn(th) { - // Get the current sorting direction of the 
selected header, - // clear state on other headers and then set the new sorting direction. - const currentSortOrder = th.getAttribute("aria-sort"); - [...th.parentElement.cells].forEach(header => header.setAttribute("aria-sort", "none")); - var direction; - if (currentSortOrder === "none") { - direction = th.dataset.defaultSortOrder || "ascending"; - } - else if (currentSortOrder === "ascending") { - direction = "descending"; - } - else { - direction = "ascending"; - } - th.setAttribute("aria-sort", direction); - - const column = [...th.parentElement.cells].indexOf(th) - - // Sort all rows and afterwards append them in order to move them in the DOM. - Array.from(th.closest("table").querySelectorAll("tbody tr")) - .sort((rowA, rowB) => rowComparator(rowA, rowB, column) * (direction === "ascending" ? 1 : -1)) - .forEach(tr => tr.parentElement.appendChild(tr)); - - // Save the sort order for next time. - if (th.id !== "region") { - let th_id = "file"; // Sort by file if we don't have a column id - let current_direction = direction; - const stored_list = localStorage.getItem(coverage.INDEX_SORT_STORAGE); - if (stored_list) { - ({th_id, direction} = JSON.parse(stored_list)) - } - localStorage.setItem(coverage.INDEX_SORT_STORAGE, JSON.stringify({ - "th_id": th.id, - "direction": current_direction - })); - if (th.id !== th_id || document.getElementById("region")) { - // Sort column has changed, unset sorting by function or class. - localStorage.setItem(coverage.SORTED_BY_REGION, JSON.stringify({ - "by_region": false, - "region_direction": current_direction - })); - } - } - else { - // Sort column has changed to by function or class, remember that. - localStorage.setItem(coverage.SORTED_BY_REGION, JSON.stringify({ - "by_region": true, - "region_direction": direction - })); - } -} - -// Find all the elements with data-shortcut attribute, and use them to assign a shortcut key. 
-coverage.assign_shortkeys = function () { - document.querySelectorAll("[data-shortcut]").forEach(element => { - document.addEventListener("keypress", event => { - if (event.target.tagName.toLowerCase() === "input") { - return; // ignore keypress from search filter - } - if (event.key === element.dataset.shortcut) { - element.click(); - } - }); - }); -}; - -// Create the events for the filter box. -coverage.wire_up_filter = function () { - // Populate the filter and hide100 inputs if there are saved values for them. - const saved_filter_value = localStorage.getItem(coverage.FILTER_STORAGE); - if (saved_filter_value) { - document.getElementById("filter").value = saved_filter_value; - } - const saved_hide100_value = localStorage.getItem(coverage.HIDE100_STORAGE); - if (saved_hide100_value) { - document.getElementById("hide100").checked = JSON.parse(saved_hide100_value); - } - - // Cache elements. - const table = document.querySelector("table.index"); - const table_body_rows = table.querySelectorAll("tbody tr"); - const no_rows = document.getElementById("no_rows"); - - // Observe filter keyevents. - const filter_handler = (event => { - // Keep running total of each metric, first index contains number of shown rows - const totals = new Array(table.rows[0].cells.length).fill(0); - // Accumulate the percentage as fraction - totals[totals.length - 1] = { "numer": 0, "denom": 0 }; // nosemgrep: eslint.detect-object-injection - - var text = document.getElementById("filter").value; - // Store filter value - localStorage.setItem(coverage.FILTER_STORAGE, text); - const casefold = (text === text.toLowerCase()); - const hide100 = document.getElementById("hide100").checked; - // Store hide value. - localStorage.setItem(coverage.HIDE100_STORAGE, JSON.stringify(hide100)); - - // Hide / show elements. - table_body_rows.forEach(row => { - var show = false; - // Check the text filter. 
- for (let column = 0; column < totals.length; column++) { - cell = row.cells[column]; - if (cell.classList.contains("name")) { - var celltext = cell.textContent; - if (casefold) { - celltext = celltext.toLowerCase(); - } - if (celltext.includes(text)) { - show = true; - } - } - } - - // Check the "hide covered" filter. - if (show && hide100) { - const [numer, denom] = row.cells[row.cells.length - 1].dataset.ratio.split(" "); - show = (numer !== denom); - } - - if (!show) { - // hide - row.classList.add("hidden"); - return; - } - - // show - row.classList.remove("hidden"); - totals[0]++; - - for (let column = 0; column < totals.length; column++) { - // Accumulate dynamic totals - cell = row.cells[column] // nosemgrep: eslint.detect-object-injection - if (cell.classList.contains("name")) { - continue; - } - if (column === totals.length - 1) { - // Last column contains percentage - const [numer, denom] = cell.dataset.ratio.split(" "); - totals[column]["numer"] += parseInt(numer, 10); // nosemgrep: eslint.detect-object-injection - totals[column]["denom"] += parseInt(denom, 10); // nosemgrep: eslint.detect-object-injection - } - else { - totals[column] += parseInt(cell.textContent, 10); // nosemgrep: eslint.detect-object-injection - } - } - }); - - // Show placeholder if no rows will be displayed. - if (!totals[0]) { - // Show placeholder, hide table. - no_rows.style.display = "block"; - table.style.display = "none"; - return; - } - - // Hide placeholder, show table. - no_rows.style.display = null; - table.style.display = null; - - const footer = table.tFoot.rows[0]; - // Calculate new dynamic sum values based on visible rows. - for (let column = 0; column < totals.length; column++) { - // Get footer cell element. - const cell = footer.cells[column]; // nosemgrep: eslint.detect-object-injection - if (cell.classList.contains("name")) { - continue; - } - - // Set value into dynamic footer cell element. 
- if (column === totals.length - 1) { - // Percentage column uses the numerator and denominator, - // and adapts to the number of decimal places. - const match = /\.([0-9]+)/.exec(cell.textContent); - const places = match ? match[1].length : 0; - const { numer, denom } = totals[column]; // nosemgrep: eslint.detect-object-injection - cell.dataset.ratio = `${numer} ${denom}`; - // Check denom to prevent NaN if filtered files contain no statements - cell.textContent = denom - ? `${(numer * 100 / denom).toFixed(places)}%` - : `${(100).toFixed(places)}%`; - } - else { - cell.textContent = totals[column]; // nosemgrep: eslint.detect-object-injection - } - } - }); - - document.getElementById("filter").addEventListener("input", debounce(filter_handler)); - document.getElementById("hide100").addEventListener("input", debounce(filter_handler)); - - // Trigger change event on setup, to force filter on page refresh - // (filter value may still be present). - document.getElementById("filter").dispatchEvent(new Event("input")); - document.getElementById("hide100").dispatchEvent(new Event("input")); -}; -coverage.FILTER_STORAGE = "COVERAGE_FILTER_VALUE"; -coverage.HIDE100_STORAGE = "COVERAGE_HIDE100_VALUE"; - -// Set up the click-to-sort columns. 
-coverage.wire_up_sorting = function () { - document.querySelectorAll("[data-sortable] th[aria-sort]").forEach( - th => th.addEventListener("click", e => sortColumn(e.target)) - ); - - // Look for a localStorage item containing previous sort settings: - let th_id = "file", direction = "ascending"; - const stored_list = localStorage.getItem(coverage.INDEX_SORT_STORAGE); - if (stored_list) { - ({th_id, direction} = JSON.parse(stored_list)); - } - let by_region = false, region_direction = "ascending"; - const sorted_by_region = localStorage.getItem(coverage.SORTED_BY_REGION); - if (sorted_by_region) { - ({ - by_region, - region_direction - } = JSON.parse(sorted_by_region)); - } - - const region_id = "region"; - if (by_region && document.getElementById(region_id)) { - direction = region_direction; - } - // If we are in a page that has a column with id of "region", sort on - // it if the last sort was by function or class. - let th; - if (document.getElementById(region_id)) { - th = document.getElementById(by_region ? region_id : th_id); - } - else { - th = document.getElementById(th_id); - } - th.setAttribute("aria-sort", direction === "ascending" ? "descending" : "ascending"); - th.click() -}; - -coverage.INDEX_SORT_STORAGE = "COVERAGE_INDEX_SORT_2"; -coverage.SORTED_BY_REGION = "COVERAGE_SORT_REGION"; - -// Loaded on index.html -coverage.index_ready = function () { - coverage.assign_shortkeys(); - coverage.wire_up_filter(); - coverage.wire_up_sorting(); - - on_click(".button_prev_file", coverage.to_prev_file); - on_click(".button_next_file", coverage.to_next_file); - - on_click(".button_show_hide_help", coverage.show_hide_help); -}; - -// -- pyfile stuff -- - -coverage.LINE_FILTERS_STORAGE = "COVERAGE_LINE_FILTERS"; - -coverage.pyfile_ready = function () { - // If we're directed to a particular line number, highlight the line. 
- var frag = location.hash; - if (frag.length > 2 && frag[1] === "t") { - document.querySelector(frag).closest(".n").classList.add("highlight"); - coverage.set_sel(parseInt(frag.substr(2), 10)); - } - else { - coverage.set_sel(0); - } - - on_click(".button_toggle_run", coverage.toggle_lines); - on_click(".button_toggle_mis", coverage.toggle_lines); - on_click(".button_toggle_exc", coverage.toggle_lines); - on_click(".button_toggle_par", coverage.toggle_lines); - - on_click(".button_next_chunk", coverage.to_next_chunk_nicely); - on_click(".button_prev_chunk", coverage.to_prev_chunk_nicely); - on_click(".button_top_of_page", coverage.to_top); - on_click(".button_first_chunk", coverage.to_first_chunk); - - on_click(".button_prev_file", coverage.to_prev_file); - on_click(".button_next_file", coverage.to_next_file); - on_click(".button_to_index", coverage.to_index); - - on_click(".button_show_hide_help", coverage.show_hide_help); - - coverage.filters = undefined; - try { - coverage.filters = localStorage.getItem(coverage.LINE_FILTERS_STORAGE); - } catch(err) {} - - if (coverage.filters) { - coverage.filters = JSON.parse(coverage.filters); - } - else { - coverage.filters = {run: false, exc: true, mis: true, par: true}; - } - - for (cls in coverage.filters) { - coverage.set_line_visibilty(cls, coverage.filters[cls]); // nosemgrep: eslint.detect-object-injection - } - - coverage.assign_shortkeys(); - coverage.init_scroll_markers(); - coverage.wire_up_sticky_header(); - - document.querySelectorAll("[id^=ctxs]").forEach( - cbox => cbox.addEventListener("click", coverage.expand_contexts) - ); - - // Rebuild scroll markers when the window height changes. 
- window.addEventListener("resize", coverage.build_scroll_markers); -}; - -coverage.toggle_lines = function (event) { - const btn = event.target.closest("button"); - const category = btn.value - const show = !btn.classList.contains("show_" + category); - coverage.set_line_visibilty(category, show); - coverage.build_scroll_markers(); - coverage.filters[category] = show; - try { - localStorage.setItem(coverage.LINE_FILTERS_STORAGE, JSON.stringify(coverage.filters)); - } catch(err) {} -}; - -coverage.set_line_visibilty = function (category, should_show) { - const cls = "show_" + category; - const btn = document.querySelector(".button_toggle_" + category); - if (btn) { - if (should_show) { - document.querySelectorAll("#source ." + category).forEach(e => e.classList.add(cls)); - btn.classList.add(cls); - } - else { - document.querySelectorAll("#source ." + category).forEach(e => e.classList.remove(cls)); - btn.classList.remove(cls); - } - } -}; - -// Return the nth line div. -coverage.line_elt = function (n) { - return document.getElementById("t" + n)?.closest("p"); -}; - -// Set the selection. b and e are line numbers. -coverage.set_sel = function (b, e) { - // The first line selected. - coverage.sel_begin = b; - // The next line not selected. - coverage.sel_end = (e === undefined) ? 
b+1 : e; -}; - -coverage.to_top = function () { - coverage.set_sel(0, 1); - coverage.scroll_window(0); -}; - -coverage.to_first_chunk = function () { - coverage.set_sel(0, 1); - coverage.to_next_chunk(); -}; - -coverage.to_prev_file = function () { - window.location = document.getElementById("prevFileLink").href; -} - -coverage.to_next_file = function () { - window.location = document.getElementById("nextFileLink").href; -} - -coverage.to_index = function () { - location.href = document.getElementById("indexLink").href; -} - -coverage.show_hide_help = function () { - const helpCheck = document.getElementById("help_panel_state") - helpCheck.checked = !helpCheck.checked; -} - -// Return a string indicating what kind of chunk this line belongs to, -// or null if not a chunk. -coverage.chunk_indicator = function (line_elt) { - const classes = line_elt?.className; - if (!classes) { - return null; - } - const match = classes.match(/\bshow_\w+\b/); - if (!match) { - return null; - } - return match[0]; -}; - -coverage.to_next_chunk = function () { - const c = coverage; - - // Find the start of the next colored chunk. - var probe = c.sel_end; - var chunk_indicator, probe_line; - while (true) { - probe_line = c.line_elt(probe); - if (!probe_line) { - return; - } - chunk_indicator = c.chunk_indicator(probe_line); - if (chunk_indicator) { - break; - } - probe++; - } - - // There's a next chunk, `probe` points to it. - var begin = probe; - - // Find the end of this chunk. - var next_indicator = chunk_indicator; - while (next_indicator === chunk_indicator) { - probe++; - probe_line = c.line_elt(probe); - next_indicator = c.chunk_indicator(probe_line); - } - c.set_sel(begin, probe); - c.show_selection(); -}; - -coverage.to_prev_chunk = function () { - const c = coverage; - - // Find the end of the prev colored chunk. 
- var probe = c.sel_begin-1; - var probe_line = c.line_elt(probe); - if (!probe_line) { - return; - } - var chunk_indicator = c.chunk_indicator(probe_line); - while (probe > 1 && !chunk_indicator) { - probe--; - probe_line = c.line_elt(probe); - if (!probe_line) { - return; - } - chunk_indicator = c.chunk_indicator(probe_line); - } - - // There's a prev chunk, `probe` points to its last line. - var end = probe+1; - - // Find the beginning of this chunk. - var prev_indicator = chunk_indicator; - while (prev_indicator === chunk_indicator) { - probe--; - if (probe <= 0) { - return; - } - probe_line = c.line_elt(probe); - prev_indicator = c.chunk_indicator(probe_line); - } - c.set_sel(probe+1, end); - c.show_selection(); -}; - -// Returns 0, 1, or 2: how many of the two ends of the selection are on -// the screen right now? -coverage.selection_ends_on_screen = function () { - if (coverage.sel_begin === 0) { - return 0; - } - - const begin = coverage.line_elt(coverage.sel_begin); - const end = coverage.line_elt(coverage.sel_end-1); - - return ( - (checkVisible(begin) ? 1 : 0) - + (checkVisible(end) ? 1 : 0) - ); -}; - -coverage.to_next_chunk_nicely = function () { - if (coverage.selection_ends_on_screen() === 0) { - // The selection is entirely off the screen: - // Set the top line on the screen as selection. 
- - // This will select the top-left of the viewport - // As this is most likely the span with the line number we take the parent - const line = document.elementFromPoint(0, 0).parentElement; - if (line.parentElement !== document.getElementById("source")) { - // The element is not a source line but the header or similar - coverage.select_line_or_chunk(1); - } - else { - // We extract the line number from the id - coverage.select_line_or_chunk(parseInt(line.id.substring(1), 10)); - } - } - coverage.to_next_chunk(); -}; - -coverage.to_prev_chunk_nicely = function () { - if (coverage.selection_ends_on_screen() === 0) { - // The selection is entirely off the screen: - // Set the lowest line on the screen as selection. - - // This will select the bottom-left of the viewport - // As this is most likely the span with the line number we take the parent - const line = document.elementFromPoint(document.documentElement.clientHeight-1, 0).parentElement; - if (line.parentElement !== document.getElementById("source")) { - // The element is not a source line but the header or similar - coverage.select_line_or_chunk(coverage.lines_len); - } - else { - // We extract the line number from the id - coverage.select_line_or_chunk(parseInt(line.id.substring(1), 10)); - } - } - coverage.to_prev_chunk(); -}; - -// Select line number lineno, or if it is in a colored chunk, select the -// entire chunk -coverage.select_line_or_chunk = function (lineno) { - var c = coverage; - var probe_line = c.line_elt(lineno); - if (!probe_line) { - return; - } - var the_indicator = c.chunk_indicator(probe_line); - if (the_indicator) { - // The line is in a highlighted chunk. - // Search backward for the first line. 
- var probe = lineno; - var indicator = the_indicator; - while (probe > 0 && indicator === the_indicator) { - probe--; - probe_line = c.line_elt(probe); - if (!probe_line) { - break; - } - indicator = c.chunk_indicator(probe_line); - } - var begin = probe + 1; - - // Search forward for the last line. - probe = lineno; - indicator = the_indicator; - while (indicator === the_indicator) { - probe++; - probe_line = c.line_elt(probe); - indicator = c.chunk_indicator(probe_line); - } - - coverage.set_sel(begin, probe); - } - else { - coverage.set_sel(lineno); - } -}; - -coverage.show_selection = function () { - // Highlight the lines in the chunk - document.querySelectorAll("#source .highlight").forEach(e => e.classList.remove("highlight")); - for (let probe = coverage.sel_begin; probe < coverage.sel_end; probe++) { - coverage.line_elt(probe).querySelector(".n").classList.add("highlight"); - } - - coverage.scroll_to_selection(); -}; - -coverage.scroll_to_selection = function () { - // Scroll the page if the chunk isn't fully visible. - if (coverage.selection_ends_on_screen() < 2) { - const element = coverage.line_elt(coverage.sel_begin); - coverage.scroll_window(element.offsetTop - 60); - } -}; - -coverage.scroll_window = function (to_pos) { - window.scroll({top: to_pos, behavior: "smooth"}); -}; - -coverage.init_scroll_markers = function () { - // Init some variables - coverage.lines_len = document.querySelectorAll("#source > p").length; - - // Build html - coverage.build_scroll_markers(); -}; - -coverage.build_scroll_markers = function () { - const temp_scroll_marker = document.getElementById("scroll_marker") - if (temp_scroll_marker) temp_scroll_marker.remove(); - // Don't build markers if the window has no scroll bar. 
- if (document.body.scrollHeight <= window.innerHeight) { - return; - } - - const marker_scale = window.innerHeight / document.body.scrollHeight; - const line_height = Math.min(Math.max(3, window.innerHeight / coverage.lines_len), 10); - - let previous_line = -99, last_mark, last_top; - - const scroll_marker = document.createElement("div"); - scroll_marker.id = "scroll_marker"; - document.getElementById("source").querySelectorAll( - "p.show_run, p.show_mis, p.show_exc, p.show_exc, p.show_par" - ).forEach(element => { - const line_top = Math.floor(element.offsetTop * marker_scale); - const line_number = parseInt(element.querySelector(".n a").id.substr(1)); - - if (line_number === previous_line + 1) { - // If this solid missed block just make previous mark higher. - last_mark.style.height = `${line_top + line_height - last_top}px`; - } - else { - // Add colored line in scroll_marker block. - last_mark = document.createElement("div"); - last_mark.id = `m${line_number}`; - last_mark.classList.add("marker"); - last_mark.style.height = `${line_height}px`; - last_mark.style.top = `${line_top}px`; - scroll_marker.append(last_mark); - last_top = line_top; - } - - previous_line = line_number; - }); - - // Append last to prevent layout calculation - document.body.append(scroll_marker); -}; - -coverage.wire_up_sticky_header = function () { - const header = document.querySelector("header"); - const header_bottom = ( - header.querySelector(".content h2").getBoundingClientRect().top - - header.getBoundingClientRect().top - ); - - function updateHeader() { - if (window.scrollY > header_bottom) { - header.classList.add("sticky"); - } - else { - header.classList.remove("sticky"); - } - } - - window.addEventListener("scroll", updateHeader); - updateHeader(); -}; - -coverage.expand_contexts = function (e) { - var ctxs = e.target.parentNode.querySelector(".ctxs"); - - if (!ctxs.classList.contains("expanded")) { - var ctxs_text = ctxs.textContent; - var width = Number(ctxs_text[0]); - 
ctxs.textContent = ""; - for (var i = 1; i < ctxs_text.length; i += width) { - key = ctxs_text.substring(i, i + width).trim(); - ctxs.appendChild(document.createTextNode(contexts[key])); - ctxs.appendChild(document.createElement("br")); - } - ctxs.classList.add("expanded"); - } -}; - -document.addEventListener("DOMContentLoaded", () => { - if (document.body.classList.contains("indexfile")) { - coverage.index_ready(); - } - else { - coverage.pyfile_ready(); - } -}); diff --git a/2.2.2/tests/favicon_32_cb_58284776.png b/2.2.2/tests/favicon_32_cb_58284776.png deleted file mode 100644 index 8649f04..0000000 Binary files a/2.2.2/tests/favicon_32_cb_58284776.png and /dev/null differ diff --git a/2.2.2/tests/function_index.html b/2.2.2/tests/function_index.html deleted file mode 100644 index 8d68886..0000000 --- a/2.2.2/tests/function_index.html +++ /dev/null @@ -1,1203 +0,0 @@ - - - - - Coverage report - - - - - -
-
-

Coverage report: - 84% -

- -
- -
- - -
-
-

- Files - Functions - Classes -

-

- coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

-
-
-
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Filefunctionstatementsmissingexcludedcoverage
src/rok4/enums.py(no function)1700100%
src/rok4/exceptions.pyMissingAttributeError.__init__300100%
src/rok4/exceptions.pyMissingEnvironmentError.__init__200100%
src/rok4/exceptions.pyStorageError.__init__300100%
src/rok4/exceptions.pyFormatError.__init__400100%
src/rok4/exceptions.py(no function)800100%
src/rok4/layer.pyLayer.from_descriptor307077%
src/rok4/layer.pyLayer.from_parameters225077%
src/rok4/layer.pyLayer.__init__600100%
src/rok4/layer.pyLayer.__load_pyramids274085%
src/rok4/layer.pyLayer.__str__1100%
src/rok4/layer.pyLayer.serializable101090%
src/rok4/layer.pyLayer.write_descriptor41075%
src/rok4/layer.pyLayer.type300100%
src/rok4/layer.pyLayer.bbox1100%
src/rok4/layer.pyLayer.geobbox100100%
src/rok4/layer.py(no function)2700100%
src/rok4/pyramid.pyb36_number_encode800100%
src/rok4/pyramid.pyb36_number_decode100100%
src/rok4/pyramid.pyb36_path_decode900100%
src/rok4/pyramid.pyb36_path_encode1500100%
src/rok4/pyramid.pyLevel.from_descriptor236074%
src/rok4/pyramid.pyLevel.from_other800100%
src/rok4/pyramid.pyLevel.__str__1100%
src/rok4/pyramid.pyLevel.serializable1610038%
src/rok4/pyramid.pyLevel.id100100%
src/rok4/pyramid.pyLevel.bbox3300%
src/rok4/pyramid.pyLevel.resolution100100%
src/rok4/pyramid.pyLevel.tile_matrix100100%
src/rok4/pyramid.pyLevel.slab_width100100%
src/rok4/pyramid.pyLevel.slab_height100100%
src/rok4/pyramid.pyLevel.tile_limits1100%
src/rok4/pyramid.pyLevel.is_in_limits100100%
src/rok4/pyramid.pyLevel.set_limits_from_bbox2200%
src/rok4/pyramid.pyPyramid.from_descriptor272093%
src/rok4/pyramid.pyPyramid.from_other285082%
src/rok4/pyramid.pyPyramid.__init__400100%
src/rok4/pyramid.pyPyramid.__str__1100%
src/rok4/pyramid.pyPyramid.serializable101090%
src/rok4/pyramid.pyPyramid.list1100%
src/rok4/pyramid.pyPyramid.descriptor100100%
src/rok4/pyramid.pyPyramid.name100100%
src/rok4/pyramid.pyPyramid.tms100100%
src/rok4/pyramid.pyPyramid.raster_specifications1100%
src/rok4/pyramid.pyPyramid.storage_type100100%
src/rok4/pyramid.pyPyramid.storage_root100100%
src/rok4/pyramid.pyPyramid.storage_depth100100%
src/rok4/pyramid.pyPyramid.storage_s3_cluster64033%
src/rok4/pyramid.pyPyramid.storage_depth31067%
src/rok4/pyramid.pyPyramid.own_masks100100%
src/rok4/pyramid.pyPyramid.format1100%
src/rok4/pyramid.pyPyramid.channels1100%
src/rok4/pyramid.pyPyramid.tile_extension93067%
src/rok4/pyramid.pyPyramid.bottom_level100100%
src/rok4/pyramid.pyPyramid.top_level1100%
src/rok4/pyramid.pyPyramid.type300100%
src/rok4/pyramid.pyPyramid.load_list700100%
src/rok4/pyramid.pyPyramid.list_generator377081%
src/rok4/pyramid.pyPyramid.get_level100100%
src/rok4/pyramid.pyPyramid.get_levels248067%
src/rok4/pyramid.pyPyramid.write_descriptor200100%
src/rok4/pyramid.pyPyramid.get_infos_from_slab_path2011045%
src/rok4/pyramid.pyPyramid.get_slab_path_from_infos62067%
src/rok4/pyramid.pyPyramid.get_tile_data_binary224082%
src/rok4/pyramid.pyPyramid.get_tile_data_raster4026035%
src/rok4/pyramid.pyPyramid.get_tile_data_vector134069%
src/rok4/pyramid.pyPyramid.get_tile_indices92078%
src/rok4/pyramid.pyPyramid.delete_level4400%
src/rok4/pyramid.pyPyramid.add_level9900%
src/rok4/pyramid.pyPyramid.size3300%
src/rok4/pyramid.py(no function)10300100%
src/rok4/raster.pyRaster.__init__600100%
src/rok4/raster.pyRaster.from_file2100100%
src/rok4/raster.pyRaster.from_parameters800100%
src/rok4/raster.pyRasterSet.__init__400100%
src/rok4/raster.pyRasterSet.from_list292093%
src/rok4/raster.pyRasterSet.from_descriptor172088%
src/rok4/raster.pyRasterSet.serializable1000100%
src/rok4/raster.pyRasterSet.write_descriptor400100%
src/rok4/raster.py(no function)2700100%
src/rok4/storage.py__get_ttl_hash31067%
src/rok4/storage.py__get_s3_client292093%
src/rok4/storage.pydisconnect_s3_clients200100%
src/rok4/storage.py__get_ceph_ioctx144071%
src/rok4/storage.pydisconnect_ceph_clients200100%
src/rok4/storage.pyget_infos_from_path131092%
src/rok4/storage.pyget_path_from_infos100100%
src/rok4/storage.pyhash_file700100%
src/rok4/storage.pyget_data_str100100%
src/rok4/storage.py__get_cached_data_binary4813073%
src/rok4/storage.pyget_data_binary100100%
src/rok4/storage.pyput_data_str2110052%
src/rok4/storage.pyget_size289068%
src/rok4/storage.pyexists306080%
src/rok4/storage.pyremove239061%
src/rok4/storage.pycopy17329083%
src/rok4/storage.pylink348076%
src/rok4/storage.pyget_osgeo_path131092%
src/rok4/storage.pysize_path252092%
src/rok4/storage.py(no function)7012083%
src/rok4/style.pyColour.__init__185072%
src/rok4/style.pyColour.rgba100100%
src/rok4/style.pyColour.rgb100100%
src/rok4/style.pyPalette.__init__131092%
src/rok4/style.pyPalette.convert232091%
src/rok4/style.pySlope.__init__82075%
src/rok4/style.pyExposition.__init__72071%
src/rok4/style.pyEstompage.__init__82075%
src/rok4/style.pyLegend.__init__92078%
src/rok4/style.pyStyle.__init__3200100%
src/rok4/style.pyStyle.bands700100%
src/rok4/style.pyStyle.format500100%
src/rok4/style.pyStyle.input_nodata91089%
src/rok4/style.pyStyle.is_identity100100%
src/rok4/style.py(no function)3500100%
src/rok4/tile_matrix_set.pyTileMatrix.__init__1200100%
src/rok4/tile_matrix_set.pyTileMatrix.x_to_column100100%
src/rok4/tile_matrix_set.pyTileMatrix.y_to_row100100%
src/rok4/tile_matrix_set.pyTileMatrix.tile_to_bbox300100%
src/rok4/tile_matrix_set.pyTileMatrix.bbox_to_tiles300100%
src/rok4/tile_matrix_set.pyTileMatrix.point_to_indices600100%
src/rok4/tile_matrix_set.pyTileMatrix.tile_width1100%
src/rok4/tile_matrix_set.pyTileMatrix.tile_heigth1100%
src/rok4/tile_matrix_set.pyTileMatrixSet.__init__2400100%
src/rok4/tile_matrix_set.pyTileMatrixSet.get_level100100%
src/rok4/tile_matrix_set.pyTileMatrixSet.sorted_levels1100%
src/rok4/tile_matrix_set.py(no function)2300100%
src/rok4/utils.pysrs_to_spatialreference800100%
src/rok4/utils.pybbox_to_geometry2200100%
src/rok4/utils.pyreproject_bbox151093%
src/rok4/utils.pyreproject_point91089%
src/rok4/utils.pycompute_bbox131092%
src/rok4/utils.pycompute_format2100100%
src/rok4/utils.py(no function)1500100%
src/rok4/vector.pyVector.from_file8512086%
src/rok4/vector.pyVector.from_parameters500100%
src/rok4/vector.py(no function)1000100%
Total 1787291084%
-

- No items found using the specified filter. -

-
- - - diff --git a/2.2.2/tests/index.html b/2.2.2/tests/index.html deleted file mode 100644 index de85c46..0000000 --- a/2.2.2/tests/index.html +++ /dev/null @@ -1,174 +0,0 @@ - - - - - Coverage report - - - - - -
-
-

Coverage report: - 84% -

- -
- -
- - -
-
-

- Files - Functions - Classes -

-

- coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

-
-
-
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Filestatementsmissingexcludedcoverage
src/rok4/enums.py1700100%
src/rok4/exceptions.py2000100%
src/rok4/layer.py13220085%
src/rok4/pyramid.py497125075%
src/rok4/raster.py1264097%
src/rok4/storage.py538107080%
src/rok4/style.py17717090%
src/rok4/tile_matrix_set.py773096%
src/rok4/utils.py1033097%
src/rok4/vector.py10012088%
Total1787291084%
-

- No items found using the specified filter. -

-
- - - diff --git a/2.2.2/tests/keybd_closed_cb_ce680311.png b/2.2.2/tests/keybd_closed_cb_ce680311.png deleted file mode 100644 index ba119c4..0000000 Binary files a/2.2.2/tests/keybd_closed_cb_ce680311.png and /dev/null differ diff --git a/2.2.2/tests/status.json b/2.2.2/tests/status.json deleted file mode 100644 index 0eb0dde..0000000 --- a/2.2.2/tests/status.json +++ /dev/null @@ -1 +0,0 @@ -{"note":"This file is an internal implementation detail to speed up HTML report generation. Its format can change at any time. You might be looking for the JSON report: https://coverage.rtfd.io/cmd.html#cmd-json","format":5,"version":"7.6.1","globals":"8f5805e051ada8ce20cfff8cee22fa3c","files":{"z_4cdda0aa429327c0_enums_py":{"hash":"1263640a4376119c8da03a5861a1685e","index":{"url":"z_4cdda0aa429327c0_enums_py.html","file":"src/rok4/enums.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":17,"n_excluded":0,"n_missing":0,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_exceptions_py":{"hash":"13cd3d3654b2e0e5c461b54fdd0068b0","index":{"url":"z_4cdda0aa429327c0_exceptions_py.html","file":"src/rok4/exceptions.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":20,"n_excluded":0,"n_missing":0,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_layer_py":{"hash":"435d6ddcaf69a3243aed01c8b691d14b","index":{"url":"z_4cdda0aa429327c0_layer_py.html","file":"src/rok4/layer.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":132,"n_excluded":0,"n_missing":20,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_pyramid_py":{"hash":"30b4cf07ac73e190a0f3232b377699a8","index":{"url":"z_4cdda0aa429327c0_pyramid_py.html","file":"src/rok4/pyramid.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":497,"n_excluded":0,"n_missing":125,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_raster_py":{
"hash":"507790295f245a4cdbb3c28019fba3b6","index":{"url":"z_4cdda0aa429327c0_raster_py.html","file":"src/rok4/raster.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":126,"n_excluded":0,"n_missing":4,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_storage_py":{"hash":"a0078ab7cb4a964fadc4687913825a07","index":{"url":"z_4cdda0aa429327c0_storage_py.html","file":"src/rok4/storage.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":538,"n_excluded":0,"n_missing":107,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_style_py":{"hash":"db42657da0567c565adbc4c1d5ad28b4","index":{"url":"z_4cdda0aa429327c0_style_py.html","file":"src/rok4/style.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":177,"n_excluded":0,"n_missing":17,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_tile_matrix_set_py":{"hash":"6e47ae4bb31159809f7a9c81ae217849","index":{"url":"z_4cdda0aa429327c0_tile_matrix_set_py.html","file":"src/rok4/tile_matrix_set.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":77,"n_excluded":0,"n_missing":3,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_utils_py":{"hash":"b0e9c3a0b3484069f6b48e75b5eaec2f","index":{"url":"z_4cdda0aa429327c0_utils_py.html","file":"src/rok4/utils.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":103,"n_excluded":0,"n_missing":3,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}},"z_4cdda0aa429327c0_vector_py":{"hash":"84111e3c74f07818018a6bc85579fa45","index":{"url":"z_4cdda0aa429327c0_vector_py.html","file":"src/rok4/vector.py","description":"","nums":{"precision":0,"n_files":1,"n_statements":100,"n_excluded":0,"n_missing":12,"n_branches":0,"n_partial_branches":0,"n_missing_branches":0}}}}} \ No newline at end of file diff --git a/2.2.2/tests/style_cb_8e611ae1.css 
b/2.2.2/tests/style_cb_8e611ae1.css deleted file mode 100644 index 3cdaf05..0000000 --- a/2.2.2/tests/style_cb_8e611ae1.css +++ /dev/null @@ -1,337 +0,0 @@ -@charset "UTF-8"; -/* Licensed under the Apache License: http://www.apache.org/licenses/LICENSE-2.0 */ -/* For details: https://github.com/nedbat/coveragepy/blob/master/NOTICE.txt */ -/* Don't edit this .css file. Edit the .scss file instead! */ -html, body, h1, h2, h3, p, table, td, th { margin: 0; padding: 0; border: 0; font-weight: inherit; font-style: inherit; font-size: 100%; font-family: inherit; vertical-align: baseline; } - -body { font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; font-size: 1em; background: #fff; color: #000; } - -@media (prefers-color-scheme: dark) { body { background: #1e1e1e; } } - -@media (prefers-color-scheme: dark) { body { color: #eee; } } - -html > body { font-size: 16px; } - -a:active, a:focus { outline: 2px dashed #007acc; } - -p { font-size: .875em; line-height: 1.4em; } - -table { border-collapse: collapse; } - -td { vertical-align: top; } - -table tr.hidden { display: none !important; } - -p#no_rows { display: none; font-size: 1.15em; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; } - -a.nav { text-decoration: none; color: inherit; } - -a.nav:hover { text-decoration: underline; color: inherit; } - -.hidden { display: none; } - -header { background: #f8f8f8; width: 100%; z-index: 2; border-bottom: 1px solid #ccc; } - -@media (prefers-color-scheme: dark) { header { background: black; } } - -@media (prefers-color-scheme: dark) { header { border-color: #333; } } - -header .content { padding: 1rem 3.5rem; } - -header h2 { margin-top: .5em; font-size: 1em; } - -header h2 a.button { font-family: inherit; font-size: inherit; border: 1px solid; border-radius: .2em; background: #eee; color: inherit; text-decoration: none; padding: .1em .5em; 
margin: 1px calc(.1em + 1px); cursor: pointer; border-color: #ccc; } - -@media (prefers-color-scheme: dark) { header h2 a.button { background: #333; } } - -@media (prefers-color-scheme: dark) { header h2 a.button { border-color: #444; } } - -header h2 a.button.current { border: 2px solid; background: #fff; border-color: #999; cursor: default; } - -@media (prefers-color-scheme: dark) { header h2 a.button.current { background: #1e1e1e; } } - -@media (prefers-color-scheme: dark) { header h2 a.button.current { border-color: #777; } } - -header p.text { margin: .5em 0 -.5em; color: #666; font-style: italic; } - -@media (prefers-color-scheme: dark) { header p.text { color: #aaa; } } - -header.sticky { position: fixed; left: 0; right: 0; height: 2.5em; } - -header.sticky .text { display: none; } - -header.sticky h1, header.sticky h2 { font-size: 1em; margin-top: 0; display: inline-block; } - -header.sticky .content { padding: 0.5rem 3.5rem; } - -header.sticky .content p { font-size: 1em; } - -header.sticky ~ #source { padding-top: 6.5em; } - -main { position: relative; z-index: 1; } - -footer { margin: 1rem 3.5rem; } - -footer .content { padding: 0; color: #666; font-style: italic; } - -@media (prefers-color-scheme: dark) { footer .content { color: #aaa; } } - -#index { margin: 1rem 0 0 3.5rem; } - -h1 { font-size: 1.25em; display: inline-block; } - -#filter_container { float: right; margin: 0 2em 0 0; line-height: 1.66em; } - -#filter_container #filter { width: 10em; padding: 0.2em 0.5em; border: 2px solid #ccc; background: #fff; color: #000; } - -@media (prefers-color-scheme: dark) { #filter_container #filter { border-color: #444; } } - -@media (prefers-color-scheme: dark) { #filter_container #filter { background: #1e1e1e; } } - -@media (prefers-color-scheme: dark) { #filter_container #filter { color: #eee; } } - -#filter_container #filter:focus { border-color: #007acc; } - -#filter_container :disabled ~ label { color: #ccc; } - -@media (prefers-color-scheme: dark) { 
#filter_container :disabled ~ label { color: #444; } } - -#filter_container label { font-size: .875em; color: #666; } - -@media (prefers-color-scheme: dark) { #filter_container label { color: #aaa; } } - -header button { font-family: inherit; font-size: inherit; border: 1px solid; border-radius: .2em; background: #eee; color: inherit; text-decoration: none; padding: .1em .5em; margin: 1px calc(.1em + 1px); cursor: pointer; border-color: #ccc; } - -@media (prefers-color-scheme: dark) { header button { background: #333; } } - -@media (prefers-color-scheme: dark) { header button { border-color: #444; } } - -header button:active, header button:focus { outline: 2px dashed #007acc; } - -header button.run { background: #eeffee; } - -@media (prefers-color-scheme: dark) { header button.run { background: #373d29; } } - -header button.run.show_run { background: #dfd; border: 2px solid #00dd00; margin: 0 .1em; } - -@media (prefers-color-scheme: dark) { header button.run.show_run { background: #373d29; } } - -header button.mis { background: #ffeeee; } - -@media (prefers-color-scheme: dark) { header button.mis { background: #4b1818; } } - -header button.mis.show_mis { background: #fdd; border: 2px solid #ff0000; margin: 0 .1em; } - -@media (prefers-color-scheme: dark) { header button.mis.show_mis { background: #4b1818; } } - -header button.exc { background: #f7f7f7; } - -@media (prefers-color-scheme: dark) { header button.exc { background: #333; } } - -header button.exc.show_exc { background: #eee; border: 2px solid #808080; margin: 0 .1em; } - -@media (prefers-color-scheme: dark) { header button.exc.show_exc { background: #333; } } - -header button.par { background: #ffffd5; } - -@media (prefers-color-scheme: dark) { header button.par { background: #650; } } - -header button.par.show_par { background: #ffa; border: 2px solid #bbbb00; margin: 0 .1em; } - -@media (prefers-color-scheme: dark) { header button.par.show_par { background: #650; } } - -#help_panel, #source p 
.annotate.long { display: none; position: absolute; z-index: 999; background: #ffffcc; border: 1px solid #888; border-radius: .2em; color: #333; padding: .25em .5em; } - -#source p .annotate.long { white-space: normal; float: right; top: 1.75em; right: 1em; height: auto; } - -#help_panel_wrapper { float: right; position: relative; } - -#keyboard_icon { margin: 5px; } - -#help_panel_state { display: none; } - -#help_panel { top: 25px; right: 0; padding: .75em; border: 1px solid #883; color: #333; } - -#help_panel .keyhelp p { margin-top: .75em; } - -#help_panel .legend { font-style: italic; margin-bottom: 1em; } - -.indexfile #help_panel { width: 25em; } - -.pyfile #help_panel { width: 18em; } - -#help_panel_state:checked ~ #help_panel { display: block; } - -kbd { border: 1px solid black; border-color: #888 #333 #333 #888; padding: .1em .35em; font-family: SFMono-Regular, Menlo, Monaco, Consolas, monospace; font-weight: bold; background: #eee; border-radius: 3px; } - -#source { padding: 1em 0 1em 3.5rem; font-family: SFMono-Regular, Menlo, Monaco, Consolas, monospace; } - -#source p { position: relative; white-space: pre; } - -#source p * { box-sizing: border-box; } - -#source p .n { float: left; text-align: right; width: 3.5rem; box-sizing: border-box; margin-left: -3.5rem; padding-right: 1em; color: #999; user-select: none; } - -@media (prefers-color-scheme: dark) { #source p .n { color: #777; } } - -#source p .n.highlight { background: #ffdd00; } - -#source p .n a { scroll-margin-top: 6em; text-decoration: none; color: #999; } - -@media (prefers-color-scheme: dark) { #source p .n a { color: #777; } } - -#source p .n a:hover { text-decoration: underline; color: #999; } - -@media (prefers-color-scheme: dark) { #source p .n a:hover { color: #777; } } - -#source p .t { display: inline-block; width: 100%; box-sizing: border-box; margin-left: -.5em; padding-left: 0.3em; border-left: 0.2em solid #fff; } - -@media (prefers-color-scheme: dark) { #source p .t { 
border-color: #1e1e1e; } } - -#source p .t:hover { background: #f2f2f2; } - -@media (prefers-color-scheme: dark) { #source p .t:hover { background: #282828; } } - -#source p .t:hover ~ .r .annotate.long { display: block; } - -#source p .t .com { color: #008000; font-style: italic; line-height: 1px; } - -@media (prefers-color-scheme: dark) { #source p .t .com { color: #6a9955; } } - -#source p .t .key { font-weight: bold; line-height: 1px; } - -#source p .t .str { color: #0451a5; } - -@media (prefers-color-scheme: dark) { #source p .t .str { color: #9cdcfe; } } - -#source p.mis .t { border-left: 0.2em solid #ff0000; } - -#source p.mis.show_mis .t { background: #fdd; } - -@media (prefers-color-scheme: dark) { #source p.mis.show_mis .t { background: #4b1818; } } - -#source p.mis.show_mis .t:hover { background: #f2d2d2; } - -@media (prefers-color-scheme: dark) { #source p.mis.show_mis .t:hover { background: #532323; } } - -#source p.run .t { border-left: 0.2em solid #00dd00; } - -#source p.run.show_run .t { background: #dfd; } - -@media (prefers-color-scheme: dark) { #source p.run.show_run .t { background: #373d29; } } - -#source p.run.show_run .t:hover { background: #d2f2d2; } - -@media (prefers-color-scheme: dark) { #source p.run.show_run .t:hover { background: #404633; } } - -#source p.exc .t { border-left: 0.2em solid #808080; } - -#source p.exc.show_exc .t { background: #eee; } - -@media (prefers-color-scheme: dark) { #source p.exc.show_exc .t { background: #333; } } - -#source p.exc.show_exc .t:hover { background: #e2e2e2; } - -@media (prefers-color-scheme: dark) { #source p.exc.show_exc .t:hover { background: #3c3c3c; } } - -#source p.par .t { border-left: 0.2em solid #bbbb00; } - -#source p.par.show_par .t { background: #ffa; } - -@media (prefers-color-scheme: dark) { #source p.par.show_par .t { background: #650; } } - -#source p.par.show_par .t:hover { background: #f2f2a2; } - -@media (prefers-color-scheme: dark) { #source p.par.show_par .t:hover { background: 
#6d5d0c; } } - -#source p .r { position: absolute; top: 0; right: 2.5em; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; } - -#source p .annotate { font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; color: #666; padding-right: .5em; } - -@media (prefers-color-scheme: dark) { #source p .annotate { color: #ddd; } } - -#source p .annotate.short:hover ~ .long { display: block; } - -#source p .annotate.long { width: 30em; right: 2.5em; } - -#source p input { display: none; } - -#source p input ~ .r label.ctx { cursor: pointer; border-radius: .25em; } - -#source p input ~ .r label.ctx::before { content: "▶ "; } - -#source p input ~ .r label.ctx:hover { background: #e8f4ff; color: #666; } - -@media (prefers-color-scheme: dark) { #source p input ~ .r label.ctx:hover { background: #0f3a42; } } - -@media (prefers-color-scheme: dark) { #source p input ~ .r label.ctx:hover { color: #aaa; } } - -#source p input:checked ~ .r label.ctx { background: #d0e8ff; color: #666; border-radius: .75em .75em 0 0; padding: 0 .5em; margin: -.25em 0; } - -@media (prefers-color-scheme: dark) { #source p input:checked ~ .r label.ctx { background: #056; } } - -@media (prefers-color-scheme: dark) { #source p input:checked ~ .r label.ctx { color: #aaa; } } - -#source p input:checked ~ .r label.ctx::before { content: "▼ "; } - -#source p input:checked ~ .ctxs { padding: .25em .5em; overflow-y: scroll; max-height: 10.5em; } - -#source p label.ctx { color: #999; display: inline-block; padding: 0 .5em; font-size: .8333em; } - -@media (prefers-color-scheme: dark) { #source p label.ctx { color: #777; } } - -#source p .ctxs { display: block; max-height: 0; overflow-y: hidden; transition: all .2s; padding: 0 .5em; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; white-space: nowrap; background: #d0e8ff; 
border-radius: .25em; margin-right: 1.75em; text-align: right; } - -@media (prefers-color-scheme: dark) { #source p .ctxs { background: #056; } } - -#index { font-family: SFMono-Regular, Menlo, Monaco, Consolas, monospace; font-size: 0.875em; } - -#index table.index { margin-left: -.5em; } - -#index td, #index th { text-align: right; padding: .25em .5em; border-bottom: 1px solid #eee; } - -@media (prefers-color-scheme: dark) { #index td, #index th { border-color: #333; } } - -#index td.name, #index th.name { text-align: left; width: auto; font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; min-width: 15em; } - -#index th { font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Ubuntu, Cantarell, "Helvetica Neue", sans-serif; font-style: italic; color: #333; cursor: pointer; } - -@media (prefers-color-scheme: dark) { #index th { color: #ddd; } } - -#index th:hover { background: #eee; } - -@media (prefers-color-scheme: dark) { #index th:hover { background: #333; } } - -#index th .arrows { color: #666; font-size: 85%; font-family: sans-serif; font-style: normal; pointer-events: none; } - -#index th[aria-sort="ascending"], #index th[aria-sort="descending"] { white-space: nowrap; background: #eee; padding-left: .5em; } - -@media (prefers-color-scheme: dark) { #index th[aria-sort="ascending"], #index th[aria-sort="descending"] { background: #333; } } - -#index th[aria-sort="ascending"] .arrows::after { content: " ▲"; } - -#index th[aria-sort="descending"] .arrows::after { content: " ▼"; } - -#index td.name { font-size: 1.15em; } - -#index td.name a { text-decoration: none; color: inherit; } - -#index td.name .no-noun { font-style: italic; } - -#index tr.total td, #index tr.total_dynamic td { font-weight: bold; border-top: 1px solid #ccc; border-bottom: none; } - -#index tr.region:hover { background: #eee; } - -@media (prefers-color-scheme: dark) { #index tr.region:hover { background: #333; } } 
- -#index tr.region:hover td.name { text-decoration: underline; color: inherit; } - -#scroll_marker { position: fixed; z-index: 3; right: 0; top: 0; width: 16px; height: 100%; background: #fff; border-left: 1px solid #eee; will-change: transform; } - -@media (prefers-color-scheme: dark) { #scroll_marker { background: #1e1e1e; } } - -@media (prefers-color-scheme: dark) { #scroll_marker { border-color: #333; } } - -#scroll_marker .marker { background: #ccc; position: absolute; min-height: 3px; width: 100%; } - -@media (prefers-color-scheme: dark) { #scroll_marker .marker { background: #444; } } diff --git a/2.2.2/tests/z_4cdda0aa429327c0_enums_py.html b/2.2.2/tests/z_4cdda0aa429327c0_enums_py.html deleted file mode 100644 index 9d167b5..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_enums_py.html +++ /dev/null @@ -1,137 +0,0 @@ - - - - - Coverage for src/rok4/enums.py: 100% - - - - - -
-
-

- Coverage for src/rok4/enums.py: - 100% -

- -

- 17 statements   - - - -

-

- « prev     - ^ index     - » next -       - coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1#! python3 # noqa: E265 

-

2 

-

3# standard lib 

-

4from enum import Enum 

-

5 

-

6 

-

7class PyramidType(Enum): 

-

8 """Pyramid's data type""" 

-

9 

-

10 RASTER = "RASTER" 

-

11 VECTOR = "VECTOR" 

-

12 

-

13 

-

14class SlabType(Enum): 

-

15 """Slab's type""" 

-

16 

-

17 DATA = "DATA" # Slab of data, raster or vector 

-

18    MASK = "MASK" # Slab of mask, only for raster pyramids, one-band image: 0 is nodata, other values are data 

-

19 

-

20 

-

21class StorageType(Enum): 

-

22 """Storage type and path's protocol""" 

-

23 

-

24 CEPH = "ceph://" 

-

25 FILE = "file://" 

-

26 HTTP = "http://" 

-

27 HTTPS = "https://" 

-

28 S3 = "s3://" 

-

29 

-

30 

-

31class ColorFormat(Enum): 

-

32 """A color format enumeration. 

-

33 Except from "BIT", the member's name matches 

-

34 a common variable format name. The member's value is 

-

35 the allocated bit size associated to this format. 

-

36 """ 

-

37 

-

38 BIT = 1 

-

39 UINT8 = 8 

-

40 FLOAT32 = 32 

-
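The `ColorFormat` values encode the allocated bit size per sample, so per-pixel sizes follow by multiplication. A minimal standalone sketch (the enum is re-declared here so the snippet runs on its own; `bits_per_pixel` is a hypothetical helper, not part of the library):

```python
from enum import Enum


class ColorFormat(Enum):
    """Mirror of rok4.enums.ColorFormat: the value is the allocated bit size."""

    BIT = 1
    UINT8 = 8
    FLOAT32 = 32


def bits_per_pixel(color_format: ColorFormat, channels: int) -> int:
    # The enum value is the per-sample bit size, so a pixel's size is
    # that value times the number of channels.
    return color_format.value * channels


# A 3-channel UINT8 image uses 24 bits per pixel.
assert bits_per_pixel(ColorFormat.UINT8, 3) == 24
```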
- - - diff --git a/2.2.2/tests/z_4cdda0aa429327c0_exceptions_py.html b/2.2.2/tests/z_4cdda0aa429327c0_exceptions_py.html deleted file mode 100644 index 87c04d6..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_exceptions_py.html +++ /dev/null @@ -1,139 +0,0 @@ - - - - - Coverage for src/rok4/exceptions.py: 100% - - - - - -
-
-

- Coverage for src/rok4/exceptions.py: - 100% -

- -

- 20 statements   - - - -

-

- « prev     - ^ index     - » next -       - coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1class MissingAttributeError(Exception): 

-

2 """ 

-

3 Exception raised when an attribute is missing in a file 

-

4 """ 

-

5 

-

6 def __init__(self, path, missing): 

-

7 self.path = path 

-

8 self.missing = missing 

-

9 super().__init__(f"Missing attribute {missing} in '{path}'") 

-

10 

-

11 

-

12class MissingEnvironmentError(Exception): 

-

13 """ 

-

14 Exception raised when a needed environment variable is not defined 

-

15 """ 

-

16 

-

17 def __init__(self, missing): 

-

18 self.missing = missing 

-

19 super().__init__(f"Missing environment variable {missing}") 

-

20 

-

21 

-

22class StorageError(Exception): 

-

23 """ 

-

24    Exception raised when an issue occurred while using a storage 

-

25 """ 

-

26 

-

27 def __init__(self, type, issue): 

-

28 self.type = type 

-

29 self.issue = issue 

-

30        super().__init__(f"Issue occurred using a {type} storage: {issue}") 

-

31 

-

32 

-

33class FormatError(Exception): 

-

34 """ 

-

35 Exception raised when a format is expected but not respected 

-

36 """ 

-

37 

-

38 def __init__(self, expected_format, content, issue): 

-

39 self.expected_format = expected_format 

-

40 self.content = content 

-

41 self.issue = issue 

-

42        super().__init__(f"Expected format {expected_format} to read '{content}': {issue}") 
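These exceptions keep their context as attributes in addition to the formatted message. A standalone usage sketch (`MissingAttributeError` is re-declared as shown above so the snippet runs on its own):

```python
class MissingAttributeError(Exception):
    """Raised when an attribute is missing in a file (as in rok4.exceptions)."""

    def __init__(self, path, missing):
        self.path = path
        self.missing = missing
        super().__init__(f"Missing attribute {missing} in '{path}'")


try:
    raise MissingAttributeError("pyramid.json", "'levels'")
except MissingAttributeError as e:
    # The structured fields stay available to callers that want more
    # than the formatted message.
    assert e.path == "pyramid.json"
    assert str(e) == "Missing attribute 'levels' in 'pyramid.json'"
```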

-
- - - diff --git a/2.2.2/tests/z_4cdda0aa429327c0_layer_py.html b/2.2.2/tests/z_4cdda0aa429327c0_layer_py.html deleted file mode 100644 index 35a07ef..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_layer_py.html +++ /dev/null @@ -1,409 +0,0 @@ - - - - - Coverage for src/rok4/layer.py: 85% - - - - - -
-
-

- Coverage for src/rok4/layer.py: - 85% -

- -

- 132 statements   - - - -

-

- « prev     - ^ index     - » next -       - coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1"""Provide classes to use a layer. 

-

2 

-

3The module contains the following class: 

-

4 

-

5- `Layer` - Descriptor to broadcast pyramids' data 

-

6""" 

-

7 

-

8# -- IMPORTS -- 

-

9 

-

10# standard library 

-

11import json 

-

12import os 

-

13import re 

-

14from json.decoder import JSONDecodeError 

-

15from typing import Dict, List, Tuple 

-

16 

-

17# package 

-

18from rok4.enums import PyramidType 

-

19from rok4.exceptions import FormatError, MissingAttributeError 

-

20from rok4.pyramid import Pyramid 

-

21from rok4.storage import get_data_str, get_infos_from_path, put_data_str 

-

22from rok4.utils import reproject_bbox 

-

23 

-

24 

-

25class Layer: 

-

26 """A data layer, raster or vector 

-

27 

-

28 Attributes: 

-

29 __name (str): layer's technical name 

-

30        __pyramids (List[Dict[str, Union[rok4.pyramid.Pyramid, str]]]): used pyramids with their extreme levels 

-

31        __format (str): pyramids' shared data format 

-

32 __tms (rok4.tile_matrix_set.TileMatrixSet): Used grid 

-

33 __keywords (List[str]): Keywords 

-

34 __levels (Dict[str, rok4.pyramid.Level]): Used pyramids' levels 

-

35        __best_level (rok4.pyramid.Level): Used pyramids' best level 

-

36        __resampling (str): Interpolation to use for resampling 

-

37 __bbox (Tuple[float, float, float, float]): data bounding box, TMS coordinates system 

-

38 __geobbox (Tuple[float, float, float, float]): data bounding box, EPSG:4326 

-

39 """ 

-

40 

-

41 @classmethod 

-

42 def from_descriptor(cls, descriptor: str) -> "Layer": 

-

43 """Create a layer from its descriptor 

-

44 

-

45 Args: 

-

46 descriptor (str): layer's descriptor path 

-

47 

-

48 Raises: 

-

49 FormatError: Provided path is not a well formed JSON 

-

50 MissingAttributeError: Attribute is missing in the content 

-

51 StorageError: Storage read issue (layer descriptor) 

-

52            MissingEnvironmentError: Missing object storage information 

-

53 

-

54 Returns: 

-

55 Layer: a Layer instance 

-

56 """ 

-

57 try: 

-

58 data = json.loads(get_data_str(descriptor)) 

-

59 

-

60 except JSONDecodeError as e: 

-

61 raise FormatError("JSON", descriptor, e) 

-

62 

-

63 layer = cls() 

-

64 

-

65 storage_type, path, root, base_name = get_infos_from_path(descriptor) 

-

66        layer.__name = base_name[:-5] # remove the .json extension 

-

67 

-

68 try: 

-

69            # Common attributes 

-

70 layer.__title = data["title"] 

-

71 layer.__abstract = data["abstract"] 

-

72 layer.__load_pyramids(data["pyramids"]) 

-

73 

-

74            # Optional parameters 

-

75 if "keywords" in data: 

-

76 for k in data["keywords"]: 

-

77 layer.__keywords.append(k) 

-

78 

-

79 if layer.type == PyramidType.RASTER: 

-

80 if "resampling" in data: 

-

81 layer.__resampling = data["resampling"] 

-

82 

-

83 if "styles" in data: 

-

84 layer.__styles = data["styles"] 

-

85 else: 

-

86 layer.__styles = ["normal"] 

-

87 

-

88            # Native and geographic bounding boxes 

-

89 if "bbox" in data: 

-

90 layer.__geobbox = ( 

-

91 data["bbox"]["south"], 

-

92 data["bbox"]["west"], 

-

93 data["bbox"]["north"], 

-

94 data["bbox"]["east"], 

-

95 ) 

-

96 layer.__bbox = reproject_bbox(layer.__geobbox, "EPSG:4326", layer.__tms.srs, 5) 

-

97                # The layer's extent is forced, so the matching tile limits are recomputed for each level 

-

98 for level in layer.__levels.values(): 

-

99 level.set_limits_from_bbox(layer.__bbox) 

-

100 else: 

-

101 layer.__bbox = layer.__best_level.bbox 

-

102 layer.__geobbox = reproject_bbox(layer.__bbox, layer.__tms.srs, "EPSG:4326", 5) 

-

103 

-

104 except KeyError as e: 

-

105 raise MissingAttributeError(descriptor, e) 

-

106 

-

107 return layer 

-

108 

-

109 @classmethod 

-

110 def from_parameters(cls, pyramids: List[Dict[str, str]], name: str, **kwargs) -> "Layer": 

-

111 """Create a default layer from parameters 

-

112 

-

113 Args: 

-

114            pyramids (List[Dict[str, str]]): pyramids to use and their extreme levels, bottom and top 

-

115 name (str): layer's technical name 

-

116 **title (str): Layer's title (will be equal to name if not provided) 

-

117 **abstract (str): Layer's abstract (will be equal to name if not provided) 

-

118            **styles (List[str]): Style identifiers authorized for the layer 

-

119 **resampling (str): Interpolation to use for resampling 

-

120 

-

121 Raises: 

-

122            Exception: name contains forbidden characters or used pyramids do not share the same parameters (format, tms...) 

-

123 

-

124 Returns: 

-

125 Layer: a Layer instance 

-

126 """ 

-

127 

-

128 layer = cls() 

-

129 

-

130        # Mandatory information 

-

131 if not re.match("^[A-Za-z0-9_-]*$", name): 

-

132 raise Exception( 

-

133                f"Layer's name has to contain only letters, digits, hyphens and underscores, to be URL and storage compliant ({name})" 

-

134 ) 

-

135 

-

136 layer.__name = name 

-

137 layer.__load_pyramids(pyramids) 

-

138 

-

139        # Native and geographic bounding boxes 

-

140 layer.__bbox = layer.__best_level.bbox 

-

141 layer.__geobbox = reproject_bbox(layer.__bbox, layer.__tms.srs, "EPSG:4326", 5) 

-

142 

-

143        # Computed information 

-

144 layer.__keywords.append(layer.type.name) 

-

145 layer.__keywords.append(layer.__name) 

-

146 

-

147        # Optional information 

-

148 if "title" in kwargs and kwargs["title"] is not None: 

-

149 layer.__title = kwargs["title"] 

-

150 else: 

-

151 layer.__title = name 

-

152 

-

153 if "abstract" in kwargs and kwargs["abstract"] is not None: 

-

154 layer.__abstract = kwargs["abstract"] 

-

155 else: 

-

156 layer.__abstract = name 

-

157 

-

158 if layer.type == PyramidType.RASTER: 

-

159 if "styles" in kwargs and kwargs["styles"] is not None and len(kwargs["styles"]) > 0: 

-

160 layer.__styles = kwargs["styles"] 

-

161 else: 

-

162 layer.__styles = ["normal"] 

-

163 

-

164 if "resampling" in kwargs and kwargs["resampling"] is not None: 

-

165 layer.__resampling = kwargs["resampling"] 

-

166 

-

167 return layer 

-

168 
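`from_parameters` first validates the technical name against a strict character set. That check can be exercised in isolation; `is_valid_layer_name` is a hypothetical helper using the same regular expression as the source:

```python
import re


def is_valid_layer_name(name: str) -> bool:
    # Same pattern as in Layer.from_parameters: only letters, digits,
    # hyphens and underscores, so the name stays URL and storage compliant.
    return re.match("^[A-Za-z0-9_-]*$", name) is not None


assert is_valid_layer_name("ortho_2024-v1")
assert not is_valid_layer_name("ortho/2024")  # '/' would break storage paths
```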

-

169 def __init__(self) -> None: 

-

170 self.__format = None 

-

171 self.__tms = None 

-

172 self.__best_level = None 

-

173 self.__levels = {} 

-

174 self.__keywords = [] 

-

175 self.__pyramids = [] 

-

176 

-

177 def __load_pyramids(self, pyramids: List[Dict[str, str]]) -> None: 

-

178 """Load and check pyramids 

-

179 

-

180 Args: 

-

181            pyramids (List[Dict[str, str]]): List of descriptors' paths and optionally top and bottom levels 

-

182 

-

183 Raises: 

-

184            Exception: Pyramids do not all share the same format 

-

185            Exception: Pyramids do not all share the same TMS 

-

186            Exception: Pyramids do not all share the same number of channels 

-

187            Exception: Used pyramids' levels overlap 

-

188 """ 

-

189 

-

190        # All pyramids must share the same characteristics 

-

191 channels = None 

-

192 for p in pyramids: 

-

193 pyramid = Pyramid.from_descriptor(p["path"]) 

-

194 bottom_level = p.get("bottom_level", None) 

-

195 top_level = p.get("top_level", None) 

-

196 

-

197 if bottom_level is None: 

-

198 bottom_level = pyramid.bottom_level.id 

-

199 

-

200 if top_level is None: 

-

201 top_level = pyramid.top_level.id 

-

202 

-

203 if self.__format is not None and self.__format != pyramid.format: 

-

204 raise Exception( 

-

205                    f"Used pyramids have to share the same format: {self.__format} != {pyramid.format}" 

-

206 ) 

-

207 else: 

-

208 self.__format = pyramid.format 

-

209 

-

210 if self.__tms is not None and self.__tms.id != pyramid.tms.id: 

-

211 raise Exception( 

-

212                    f"Used pyramids have to use the same TMS: {self.__tms.id} != {pyramid.tms.id}" 

-

213 ) 

-

214 else: 

-

215 self.__tms = pyramid.tms 

-

216 

-

217 if self.type == PyramidType.RASTER: 

-

218 if channels is not None and channels != pyramid.raster_specifications["channels"]: 

-

219 raise Exception( 

-

220                        f"Used RASTER pyramids have to share the same number of channels: {channels} != {pyramid.raster_specifications['channels']}" 

-

221 ) 

-

222 else: 

-

223 channels = pyramid.raster_specifications["channels"] 

-

224 self.__resampling = pyramid.raster_specifications["interpolation"] 

-

225 

-

226 levels = pyramid.get_levels(bottom_level, top_level) 

-

227 for level in levels: 

-

228 if level.id in self.__levels: 

-

229 raise Exception(f"Level {level.id} is present in two used pyramids") 

-

230 self.__levels[level.id] = level 

-

231 

-

232 self.__pyramids.append( 

-

233 {"pyramid": pyramid, "bottom_level": bottom_level, "top_level": top_level} 

-

234 ) 

-

235 

-

236 self.__best_level = sorted(self.__levels.values(), key=lambda level: level.resolution)[0] 

-

237 

-

238 def __str__(self) -> str: 

-

239 return f"{self.type.name} layer '{self.__name}'" 

-

240 

-

241 @property 

-

242 def serializable(self) -> Dict: 

-

243 """Get the dict version of the layer object, descriptor compliant 

-

244 

-

245 Returns: 

-

246 Dict: descriptor structured object description 

-

247 """ 

-

248 serialization = { 

-

249 "title": self.__title, 

-

250 "abstract": self.__abstract, 

-

251 "keywords": self.__keywords, 

-

252 "wmts": {"authorized": True}, 

-

253 "tms": {"authorized": True}, 

-

254 "bbox": { 

-

255 "south": self.__geobbox[0], 

-

256 "west": self.__geobbox[1], 

-

257 "north": self.__geobbox[2], 

-

258 "east": self.__geobbox[3], 

-

259 }, 

-

260 "pyramids": [], 

-

261 } 

-

262 

-

263 for p in self.__pyramids: 

-

264 serialization["pyramids"].append( 

-

265 { 

-

266 "bottom_level": p["bottom_level"], 

-

267 "top_level": p["top_level"], 

-

268 "path": p["pyramid"].descriptor, 

-

269 } 

-

270 ) 

-

271 

-

272 if self.type == PyramidType.RASTER: 

-

273 serialization["wms"] = { 

-

274 "authorized": True, 

-

275 "crs": ["CRS:84", "IGNF:WGS84G", "EPSG:3857", "EPSG:4258", "EPSG:4326"], 

-

276 } 

-

277 

-

278 if self.__tms.srs.upper() not in serialization["wms"]["crs"]: 

-

279 serialization["wms"]["crs"].append(self.__tms.srs.upper()) 

-

280 

-

281 serialization["styles"] = self.__styles 

-

282 serialization["resampling"] = self.__resampling 

-

283 

-

284 return serialization 

-

285 

-

286 def write_descriptor(self, directory: str = None) -> None: 

-

287 """Print layer's descriptor as JSON 

-

288 

-

289 Args: 

-

290 directory (str, optional): Directory (file or object) where to print the layer's descriptor, called <layer's name>.json. Defaults to None, JSON is printed to standard output. 

-

291 """ 

-

292 content = json.dumps(self.serializable) 

-

293 

-

294 if directory is None: 

-

295 print(content) 

-

296 else: 

-

297 put_data_str(content, os.path.join(directory, f"{self.__name}.json")) 

-

298 

-

299 @property 

-

300 def type(self) -> PyramidType: 

-

301 if self.__format == "TIFF_PBF_MVT": 

-

302 return PyramidType.VECTOR 

-

303 else: 

-

304 return PyramidType.RASTER 

-

305 

-

306 @property 

-

307 def bbox(self) -> Tuple[float, float, float, float]: 

-

308 return self.__bbox 

-

309 

-

310 @property 

-

311 def geobbox(self) -> Tuple[float, float, float, float]: 

-

312 return self.__geobbox 

-
- - - diff --git a/2.2.2/tests/z_4cdda0aa429327c0_pyramid_py.html b/2.2.2/tests/z_4cdda0aa429327c0_pyramid_py.html deleted file mode 100644 index be773e6..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_pyramid_py.html +++ /dev/null @@ -1,1578 +0,0 @@ - - - - - Coverage for src/rok4/pyramid.py: 75% - - - - - -
-
-

- Coverage for src/rok4/pyramid.py: - 75% -

- -

- 497 statements   - - - -

-

- « prev     - ^ index     - » next -       - coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1"""Provide classes to use pyramid's data. 

-

2 

-

3The module contains the following classes: 

-

4 

-

5- `Pyramid` - Data container 

-

6- `Level` - Level of a pyramid 

-

7""" 

-

8 

-

9# -- IMPORTS -- 

-

10 

-

11# standard library 

-

12import io 

-

13import json 

-

14import os 

-

15import re 

-

16import tempfile 

-

17import zlib 

-

18from json.decoder import JSONDecodeError 

-

19from typing import Dict, Iterator, List, Tuple 

-

20 

-

21# 3rd party 

-

22import mapbox_vector_tile 

-

23import numpy 

-

24from PIL import Image 

-

25 

-

26# package 

-

27from rok4.enums import PyramidType, SlabType, StorageType 

-

28from rok4.exceptions import FormatError, MissingAttributeError 

-

29from rok4.storage import ( 

-

30 copy, 

-

31 get_data_binary, 

-

32 get_data_str, 

-

33 get_infos_from_path, 

-

34 get_path_from_infos, 

-

35 put_data_str, 

-

36 remove, 

-

37 size_path, 

-

38) 

-

39from rok4.tile_matrix_set import TileMatrix, TileMatrixSet 

-

40from rok4.utils import reproject_point, srs_to_spatialreference 

-

41 

-

42# -- GLOBALS -- 

-

43ROK4_IMAGE_HEADER_SIZE = 2048 

-

44"""Slab's header size, 2048 bytes""" 

-

45 

-

46 

-

47def b36_number_encode(number: int) -> str: 

-

48 """Convert base-10 number to base-36 

-

49 

-

50 Used alphabet is '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ' 

-

51 

-

52 Args: 

-

53 number (int): base-10 number 

-

54 

-

55 Returns: 

-

56 str: base-36 number 

-

57 """ 

-

58 

-

59 alphabet = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ" 

-

60 

-

61 base36 = "" 

-

62 

-

63 if 0 <= number < len(alphabet): 

-

64 return alphabet[number] 

-

65 

-

66 while number != 0: 

-

67 number, i = divmod(number, len(alphabet)) 

-

68 base36 = alphabet[i] + base36 

-

69 

-

70 return base36 

-

71 

-

72 

-

73def b36_number_decode(number: str) -> int: 

-

74 """Convert base-36 number to base-10 

-

75 

-

76 Args: 

-

77 number (str): base-36 number 

-

78 

-

79 Returns: 

-

80 int: base-10 number 

-

81 """ 

-

82 return int(number, 36) 

-

83 

-

84 

-

85def b36_path_decode(path: str) -> Tuple[int, int]: 

-

86 """Get slab's column and row from a base-36 based path 

-

87 

-

88 Args: 

-

89 path (str): slab's path 

-

90 

-

91 Returns: 

-

92 Tuple[int, int]: slab's column and row 

-

93 """ 

-

94 

-

95 path = path.replace("/", "") 

-

96 path = re.sub(r"(\.TIFF?)", "", path.upper()) 

-

97 

-

98 b36_column = "" 

-

99 b36_row = "" 

-

100 

-

101 while len(path) > 0: 

-

102 b36_column += path[0] 

-

103 b36_row += path[1] 

-

104 path = path[2:] 

-

105 

-

106 return b36_number_decode(b36_column), b36_number_decode(b36_row) 

-

107 

-

108 

-

109def b36_path_encode(column: int, row: int, slashs: int) -> str: 

-

110 """Convert slab indices to base-36 based path, with .tif extension 

-

111 

-

112 Args: 

-

113 column (int): slab's column 

-

114 row (int): slab's row 

-

115        slashs (int): number of slashes (used to split the path) 

-

116 

-

117 Returns: 

-

118 str: base-36 based path 

-

119 """ 

-

120 

-

121 b36_column = b36_number_encode(column) 

-

122 b36_row = b36_number_encode(row) 

-

123 

-

124 max_len = max(slashs + 1, len(b36_column), len(b36_row)) 

-

125 

-

126 b36_column = b36_column.rjust(max_len, "0") 

-

127 b36_row = b36_row.rjust(max_len, "0") 

-

128 

-

129 b36_path = "" 

-

130 

-

131 while len(b36_column) > 0: 

-

132 b36_path = b36_row[-1] + b36_path 

-

133 b36_path = b36_column[-1] + b36_path 

-

134 

-

135 b36_column = b36_column[:-1] 

-

136 b36_row = b36_row[:-1] 

-

137 

-

138 if slashs > 0: 

-

139 b36_path = "/" + b36_path 

-

140 slashs -= 1 

-

141 

-

142 return f"{b36_path}.tif" 

-

143 
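The base-36 helpers can be checked end to end. This standalone sketch reproduces `b36_path_encode` and `b36_path_decode` as shown (the module-level `ALPHABET` constant is an adaptation; the originals define the alphabet locally) and verifies a round trip:

```python
import re
from typing import Tuple

ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"


def b36_number_encode(number: int) -> str:
    # Base-10 to base-36, as in the module above.
    if 0 <= number < len(ALPHABET):
        return ALPHABET[number]
    base36 = ""
    while number != 0:
        number, i = divmod(number, len(ALPHABET))
        base36 = ALPHABET[i] + base36
    return base36


def b36_path_encode(column: int, row: int, slashs: int) -> str:
    # Interleave column and row digits from the least significant pair,
    # prefixing a '/' for each of the first `slashs` pairs consumed.
    b36_column = b36_number_encode(column)
    b36_row = b36_number_encode(row)
    max_len = max(slashs + 1, len(b36_column), len(b36_row))
    b36_column = b36_column.rjust(max_len, "0")
    b36_row = b36_row.rjust(max_len, "0")
    b36_path = ""
    while len(b36_column) > 0:
        b36_path = b36_column[-1] + b36_row[-1] + b36_path
        b36_column = b36_column[:-1]
        b36_row = b36_row[:-1]
        if slashs > 0:
            b36_path = "/" + b36_path
            slashs -= 1
    return f"{b36_path}.tif"


def b36_path_decode(path: str) -> Tuple[int, int]:
    # Undo the interleaving: even characters are column digits, odd ones rows.
    path = re.sub(r"(\.TIFF?)", "", path.replace("/", "").upper())
    return int(path[0::2], 36), int(path[1::2], 36)


# Round trip for column 5424 and row 7526, split by two slashes.
assert b36_path_encode(5424, 7526, 2) == "45/6T/O2.tif"
assert b36_path_decode("45/6T/O2.tif") == (5424, 7526)
```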

-

144 

-

145class Level: 

-

146 """A pyramid's level, raster or vector 

-

147 

-

148 Attributes: 

-

149        __id (str): level's identifier; has to exist in the pyramid's used TMS 

-

150 __tile_limits (Dict[str, int]): minimum and maximum tiles' columns and rows of pyramid's content 

-

151        __slab_size (Tuple[int, int]): number of tiles in a slab, widthwise and heightwise 

-

152 __tables (List[Dict]): for a VECTOR pyramid, description of vector content, tables and attributes 

-

153 """ 

-

154 

-

155 @classmethod 

-

156 def from_descriptor(cls, data: Dict, pyramid: "Pyramid") -> "Level": 

-

157 """Create a pyramid's level from the pyramid's descriptor levels element 

-

158 

-

159 Args: 

-

160 data (Dict): level's information from the pyramid's descriptor 

-

161 pyramid (Pyramid): pyramid containing the level to create 

-

162 

-

163 Raises: 

-

164            Exception: storage type or mask presence differs between the level and the pyramid 

-

165 MissingAttributeError: Attribute is missing in the content 

-

166 

-

167 Returns: 

-

168            Level: a Level instance 

-

169 """ 

-

170 level = cls() 

-

171 

-

172 level.__pyramid = pyramid 

-

173 

-

174        # Common attributes 

-

175 try: 

-

176 level.__id = data["id"] 

-

177 level.__tile_limits = data["tile_limits"] 

-

178 level.__slab_size = ( 

-

179 data["tiles_per_width"], 

-

180 data["tiles_per_height"], 

-

181 ) 

-

182 

-

183            # Storage information: validated and stored in the pyramid 

-

184 if pyramid.storage_type.name != data["storage"]["type"]: 

-

185 raise Exception( 

-

186                    f"Pyramid {pyramid.descriptor} owns a level using a different storage type ({ data['storage']['type'] }) than its own ({pyramid.storage_type.name})" 

-

187 ) 

-

188 

-

189 if pyramid.storage_type == StorageType.FILE: 

-

190 pyramid.storage_depth = data["storage"]["path_depth"] 

-

191 

-

192 if "mask_directory" in data["storage"] or "mask_prefix" in data["storage"]: 

-

193 if not pyramid.own_masks: 

-

194 raise Exception( 

-

195                        f"Pyramid {pyramid.descriptor} does not define a mask format but level {level.__id} defines mask storage information" 

-

196 ) 

-

197 else: 

-

198 if pyramid.own_masks: 

-

199 raise Exception( 

-

200                        f"Pyramid {pyramid.descriptor} defines a mask format but level {level.__id} does not define mask storage information" 

-

201 ) 

-

202 

-

203 except KeyError as e: 

-

204 raise MissingAttributeError(pyramid.descriptor, f"levels[].{e}") 

-

205 

-

206        # Attributes specific to a vector level 

-

207 if level.__pyramid.type == PyramidType.VECTOR: 

-

208 try: 

-

209 level.__tables = data["tables"] 

-

210 

-

211 except KeyError as e: 

-

212 raise MissingAttributeError(pyramid.descriptor, f"levels[].{e}") 

-

213 

-

214 return level 

-

215 

-

216 @classmethod 

-

217 def from_other(cls, other: "Level", pyramid: "Pyramid") -> "Level": 

-

218 """Create a pyramid's level from another one 

-

219 

-

220 Args: 

-

221 other (Level): level to clone 

-

222 pyramid (Pyramid): new pyramid containing the new level 

-

223 

-

224 Raises: 

-

225            Exception: storage type or mask presence differs between the level and the pyramid 

-

226 MissingAttributeError: Attribute is missing in the content 

-

227 

-

228 Returns: 

-

229            Level: a Level instance 

-

230 """ 

-

231 

-

232 level = cls() 

-

233 

-

234        # Common attributes 

-

235 level.__id = other.__id 

-

236 level.__pyramid = pyramid 

-

237 level.__tile_limits = other.__tile_limits 

-

238 level.__slab_size = other.__slab_size 

-

239 

-

240        # Attributes specific to a vector level 

-

241 if level.__pyramid.type == PyramidType.VECTOR: 

-

242 level.__tables = other.__tables 

-

243 

-

244 return level 

-

245 

-

246 def __str__(self) -> str: 

-

247 return f"{self.__pyramid.type.name} pyramid's level '{self.__id}' ({self.__pyramid.storage_type.name} storage)" 

-

248 

-

249 @property 

-

250 def serializable(self) -> Dict: 

-

251 """Get the dict version of the pyramid object, pyramid's descriptor compliant 

-

252 

-

253 Returns: 

-

254 Dict: pyramid's descriptor structured object description 

-

255 """ 

-

256 serialization = { 

-

257 "id": self.__id, 

-

258 "tiles_per_width": self.__slab_size[0], 

-

259 "tiles_per_height": self.__slab_size[1], 

-

260 "tile_limits": self.__tile_limits, 

-

261 } 

-

262 

-

263 if self.__pyramid.type == PyramidType.VECTOR: 

-

264 serialization["tables"] = self.__tables 

-

265 

-

266 if self.__pyramid.storage_type == StorageType.FILE: 

-

267 serialization["storage"] = { 

-

268 "type": "FILE", 

-

269 "image_directory": f"{self.__pyramid.name}/DATA/{self.__id}", 

-

270 "path_depth": self.__pyramid.storage_depth, 

-

271 } 

-

272 if self.__pyramid.own_masks: 

-

273 serialization["storage"][ 

-

274 "mask_directory" 

-

275 ] = f"{self.__pyramid.name}/MASK/{self.__id}" 

-

276 

-

277 elif self.__pyramid.storage_type == StorageType.CEPH: 

-

278 serialization["storage"] = { 

-

279 "type": "CEPH", 

-

280 "image_prefix": f"{self.__pyramid.name}/DATA_{self.__id}", 

-

281 "pool_name": self.__pyramid.storage_root, 

-

282 } 

-

283 if self.__pyramid.own_masks: 

-

284 serialization["storage"]["mask_prefix"] = f"{self.__pyramid.name}/MASK_{self.__id}" 

-

285 

-

286 elif self.__pyramid.storage_type == StorageType.S3: 

-

287 serialization["storage"] = { 

-

288 "type": "S3", 

-

289 "image_prefix": f"{self.__pyramid.name}/DATA_{self.__id}", 

-

290 "bucket_name": self.__pyramid.storage_root, 

-

291 } 

-

292 if self.__pyramid.own_masks: 

-

293 serialization["storage"]["mask_prefix"] = f"{self.__pyramid.name}/MASK_{self.__id}" 

-

294 

-

295 return serialization 

-

296 

-

297 @property 

-

298 def id(self) -> str: 

-

299 return self.__id 

-

300 

-

301 @property 

-

302 def bbox(self) -> Tuple[float, float, float, float]: 

-

303 """Return level extent, based on tile limits 

-

304 

-

305 Returns: 

-

306 Tuple[float, float, float, float]: level terrain extent (xmin, ymin, xmax, ymax) 

-

307 """ 

-

308 

-

309 min_bbox = self.__pyramid.tms.get_level(self.__id).tile_to_bbox( 

-

310 self.__tile_limits["min_col"], self.__tile_limits["max_row"] 

-

311 ) 

-

312 max_bbox = self.__pyramid.tms.get_level(self.__id).tile_to_bbox( 

-

313 self.__tile_limits["max_col"], self.__tile_limits["min_row"] 

-

314 ) 

-

315 

-

316 return (min_bbox[0], min_bbox[1], max_bbox[2], max_bbox[3]) 

-

317 

-

318 @property 

-

319 def resolution(self) -> str: 

-

320 return self.__pyramid.tms.get_level(self.__id).resolution 

-

321 

-

322 @property 

-

323 def tile_matrix(self) -> TileMatrix: 

-

324 return self.__pyramid.tms.get_level(self.__id) 

-

325 

-

326 @property 

-

327 def slab_width(self) -> int: 

-

328 return self.__slab_size[0] 

-

329 

-

330 @property 

-

331 def slab_height(self) -> int: 

-

332 return self.__slab_size[1] 

-

333 

-

334 @property 

-

335 def tile_limits(self) -> Dict[str, int]: 

-

336 return self.__tile_limits 

-

337 

-

338 def is_in_limits(self, column: int, row: int) -> bool: 

-

339        """Are the tile indices within limits? 

-

340 

-

341 Args: 

-

342 column (int): tile's column 

-

343 row (int): tile's row 

-

344 

-

345 Returns: 

-

346 bool: True if tiles' limits contain the provided tile's indices 

-

347 """ 

-

348 return ( 

-

349 self.__tile_limits["min_row"] <= row 

-

350 and self.__tile_limits["max_row"] >= row 

-

351 and self.__tile_limits["min_col"] <= column 

-

352 and self.__tile_limits["max_col"] >= column 

-

353 ) 

-

354 
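`is_in_limits` is an inclusive box test on tile indices. A standalone sketch with the same logic (a free function instead of a method, as a simplification for illustration):

```python
from typing import Dict


def is_in_limits(tile_limits: Dict[str, int], column: int, row: int) -> bool:
    # The tile belongs to the level only if both indices fall inside the
    # level's min/max columns and rows (bounds inclusive).
    return (
        tile_limits["min_row"] <= row <= tile_limits["max_row"]
        and tile_limits["min_col"] <= column <= tile_limits["max_col"]
    )


limits = {"min_col": 10, "max_col": 20, "min_row": 5, "max_row": 15}
assert is_in_limits(limits, 12, 7)
assert not is_in_limits(limits, 21, 7)  # column past max_col
```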

-

355 def set_limits_from_bbox(self, bbox: Tuple[float, float, float, float]) -> None: 

-

356 """Set tile limits, based on provided bounding box 

-

357 

-

358 Args: 

-

359 bbox (Tuple[float, float, float, float]): terrain extent (xmin, ymin, xmax, ymax), in TMS coordinates system 

-

360 

-

361 """ 

-

362 

-

363 col_min, row_min, col_max, row_max = self.__pyramid.tms.get_level(self.__id).bbox_to_tiles( 

-

364 bbox 

-

365 ) 

-

366 self.__tile_limits = { 

-

367 "min_row": row_min, 

-

368 "max_col": col_max, 

-

369 "max_row": row_max, 

-

370 "min_col": col_min, 

-

371 } 

-

372 

-

373 

-

374class Pyramid: 

-

375 """A data pyramid, raster or vector 

-

376 

-

377 Attributes: 

-

378 __name (str): pyramid's name 

-

379 __descriptor (str): pyramid's descriptor path 

-

380 __list (str): pyramid's list path 

-

381 __tms (rok4.tile_matrix_set.TileMatrixSet): Used grid 

-

382 __levels (Dict[str, Level]): Pyramid's levels 

-

383 __format (str): Data format 

-

384        __storage (Dict[str, Union[rok4.enums.StorageType,str,int]]): Pyramid's storage information (type, root and depth for FILE storage) 

-

385 __raster_specifications (Dict): If raster pyramid, raster specifications 

-

386 __content (Dict): Loading status (loaded), slab count (count) and list content (cache). 

-

387 

-

388 Example (S3 storage): 

-

389 

-

390 { 

-

391 'cache': { 

-

392 (<SlabType.DATA: 'DATA'>, '18', 5424, 7526): { 

-

393 'link': False, 

-

394 'md5': None, 

-

395 'root': 'pyramids@localhost:9000/LIMADM', 

-

396 'slab': 'DATA_18_5424_7526' 

-

397 } 

-

398 }, 

-

399 'count': 1, 

-

400 'loaded': True 

-

401 } 

-

402 """ 

-

403 

-

404 @classmethod 

-

405 def from_descriptor(cls, descriptor: str) -> "Pyramid": 

-

406 """Create a pyramid from its descriptor 

-

407 

-

408 Args: 

-

409 descriptor (str): pyramid's descriptor path 

-

410 

-

411 Raises: 

-

412 FormatError: Provided path or the descriptor is not a well formed JSON 

-

413            Exception: Level issue: none in the pyramid or in the used TMS, or a level ID not defined in the TMS 

-

414 MissingAttributeError: Attribute is missing in the content 

-

415 StorageError: Storage read issue (pyramid descriptor or TMS) 

-

416            MissingEnvironmentError: Missing object storage information or TMS root directory 

-

417 

-

418 Examples: 

-

419 

-

420 S3 stored descriptor 

-

421 

-

422 from rok4.pyramid import Pyramid 

-

423 

-

424 try: 

-

425 pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json") 

-

426 except Exception as e: 

-

427 print("Cannot load the pyramid from its descriptor") 

-

428 

-

429 Returns: 

-

430 Pyramid: a Pyramid instance 

-

431 """ 

-

432 try: 

-

433 data = json.loads(get_data_str(descriptor)) 

-

434 

-

435 except JSONDecodeError as e: 

-

436 raise FormatError("JSON", descriptor, e) 

-

437 

-

438 pyramid = cls() 

-

439 

-

440 pyramid.__storage["type"], path, pyramid.__storage["root"], base_name = get_infos_from_path( 

-

441 descriptor 

-

442 ) 

-

443        pyramid.__name = base_name[:-5] # remove the .json extension 

-

444 pyramid.__descriptor = descriptor 

-

445 pyramid.__list = get_path_from_infos( 

-

446 pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.list" 

-

447 ) 

-

448 

-

449 try: 

-

450            # Common attributes 

-

451 pyramid.__tms = TileMatrixSet(data["tile_matrix_set"]) 

-

452 pyramid.__format = data["format"] 

-

453 

-

454            # Raster pyramid attributes 

-

455 if pyramid.type == PyramidType.RASTER: 

-

456 pyramid.__raster_specifications = data["raster_specifications"] 

-

457 

-

458 if "mask_format" in data: 

-

459 pyramid.__masks = True 

-

460 else: 

-

461 pyramid.__masks = False 

-

462 

-

463            # Levels 

-

464 for level in data["levels"]: 

-

465 lev = Level.from_descriptor(level, pyramid) 

-

466 pyramid.__levels[lev.id] = lev 

-

467 

-

468 if pyramid.__tms.get_level(lev.id) is None: 

-

469 raise Exception( 

-

470 f"Pyramid {descriptor} owns a level with the ID '{lev.id}', not defined in the TMS '{pyramid.tms.name}'" 

-

471 ) 

-

472 

-

473 except KeyError as e: 

-

474 raise MissingAttributeError(descriptor, e) 

-

475 

-

476 if len(pyramid.__levels.keys()) == 0: 

-

477 raise Exception(f"Pyramid '{descriptor}' has no level") 

-

478 

-

479 return pyramid 

-

480 

-

    @classmethod
    def from_other(cls, other: "Pyramid", name: str, storage: Dict, **kwargs) -> "Pyramid":
        """Create a pyramid from another one

        Args:
            other (Pyramid): pyramid to clone
            name (str): new pyramid's name
            storage (Dict[str, Union[str, int]]): new pyramid's storage information
            **mask (bool): presence or not of masks (only for RASTER)

        Raises:
            FormatError: Provided path or the TMS is not a well formed JSON
            Exception: Level issue : no one in the pyramid or the used TMS, or level ID not defined in the TMS
            MissingAttributeError: Attribute is missing in the content

        Returns:
            Pyramid: a Pyramid instance
        """
        try:
            # Convert the storage type according to the enumeration
            if type(storage["type"]) is str:
                storage["type"] = StorageType[storage["type"]]

            if storage["type"] == StorageType.FILE and name.find("/") != -1:
                raise Exception(f"A FILE stored pyramid's name cannot contain '/' : '{name}'")

            if storage["type"] == StorageType.FILE and "depth" not in storage:
                storage["depth"] = 2

            pyramid = cls()

            # Common attributes
            pyramid.__name = name
            pyramid.__storage = storage
            pyramid.__masks = other.__masks

            pyramid.__descriptor = get_path_from_infos(
                pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.json"
            )
            pyramid.__list = get_path_from_infos(
                pyramid.__storage["type"], pyramid.__storage["root"], f"{pyramid.__name}.list"
            )
            pyramid.__tms = other.__tms
            pyramid.__format = other.__format

            # Raster pyramid attributes
            if pyramid.type == PyramidType.RASTER:
                if "mask" in kwargs:
                    pyramid.__masks = kwargs["mask"]
                elif other.own_masks:
                    pyramid.__masks = True
                else:
                    pyramid.__masks = False
                pyramid.__raster_specifications = other.__raster_specifications

            # Levels
            for level in other.__levels.values():
                lev = Level.from_other(level, pyramid)
                pyramid.__levels[lev.id] = lev

        except KeyError as e:
            raise MissingAttributeError(pyramid.descriptor, e)

        return pyramid

    def __init__(self) -> None:
        self.__storage = {}
        self.__levels = {}
        self.__masks = None

        self.__content = {"loaded": False, "count": 0, "cache": {}}

    def __str__(self) -> str:
        return f"{self.type.name} pyramid '{self.__name}' ({self.__storage['type'].name} storage)"

    @property
    def serializable(self) -> Dict:
        """Get the dict version of the pyramid object, descriptor compliant

        Returns:
            Dict: descriptor structured object description
        """

        serialization = {"tile_matrix_set": self.__tms.name, "format": self.__format}

        serialization["levels"] = []
        sorted_levels = sorted(
            self.__levels.values(), key=lambda level: level.resolution, reverse=True
        )

        for level in sorted_levels:
            serialization["levels"].append(level.serializable)

        if self.type == PyramidType.RASTER:
            serialization["raster_specifications"] = self.__raster_specifications

            if self.__masks:
                serialization["mask_format"] = "TIFF_ZIP_UINT8"

        return serialization

    @property
    def list(self) -> str:
        return self.__list

    @property
    def descriptor(self) -> str:
        return self.__descriptor

    @property
    def name(self) -> str:
        return self.__name

    @property
    def tms(self) -> TileMatrixSet:
        return self.__tms

    @property
    def raster_specifications(self) -> Dict:
        """Get raster specifications for a RASTER pyramid

        Example:

            RGB pyramid with red nodata

                {
                    "channels": 3,
                    "nodata": "255,0,0",
                    "photometric": "rgb",
                    "interpolation": "bicubic"
                }

        Returns:
            Dict: Raster specifications, None if VECTOR pyramid
        """
        return self.__raster_specifications

    @property
    def storage_type(self) -> StorageType:
        """Get the storage type

        Returns:
            StorageType: FILE, S3 or CEPH
        """
        return self.__storage["type"]

    @property
    def storage_root(self) -> str:
        """Get the pyramid's storage root.

        If storage is S3, the used cluster is removed.

        Returns:
            str: Pyramid's storage root
        """

        # Remove the optional S3 cluster host from the root
        return self.__storage["root"].split("@", 1)[0]

    @property
    def storage_depth(self) -> int:
        return self.__storage.get("depth", None)

    @property
    def storage_s3_cluster(self) -> str:
        """Get the pyramid's storage S3 cluster (host name)

        Returns:
            str: the host if known, None if the default one has to be used or if storage is not S3
        """
        if self.__storage["type"] == StorageType.S3:
            try:
                return self.__storage["root"].split("@")[1]
            except IndexError:
                return None
        else:
            return None

    @storage_depth.setter
    def storage_depth(self, d: int) -> None:
        """Set the tree depth for a FILE storage

        Args:
            d (int): file storage depth

        Raises:
            Exception: the depth is not equal to the already known depth
        """
        if "depth" in self.__storage and self.__storage["depth"] != d:
            raise Exception(f"Pyramid {self.__descriptor} owns levels with different path depths")
        self.__storage["depth"] = d

    @property
    def own_masks(self) -> bool:
        return self.__masks

    @property
    def format(self) -> str:
        return self.__format

    @property
    def channels(self) -> int:
        return self.raster_specifications["channels"]

    @property
    def tile_extension(self) -> str:
        if self.__format in [
            "TIFF_RAW_UINT8",
            "TIFF_LZW_UINT8",
            "TIFF_ZIP_UINT8",
            "TIFF_PKB_UINT8",
            "TIFF_RAW_FLOAT32",
            "TIFF_LZW_FLOAT32",
            "TIFF_ZIP_FLOAT32",
            "TIFF_PKB_FLOAT32",
        ]:
            return "tif"
        elif self.__format in ["TIFF_JPG_UINT8", "TIFF_JPG90_UINT8"]:
            return "jpg"
        elif self.__format == "TIFF_PNG_UINT8":
            return "png"
        elif self.__format == "TIFF_PBF_MVT":
            return "pbf"
        else:
            raise Exception(
                f"Unknown pyramid's format ({self.__format}), cannot return the tile extension"
            )

    @property
    def bottom_level(self) -> "Level":
        """Get the best resolution level in the pyramid

        Returns:
            Level: the bottom level
        """
        return sorted(self.__levels.values(), key=lambda level: level.resolution)[0]

    @property
    def top_level(self) -> "Level":
        """Get the lowest resolution level in the pyramid

        Returns:
            Level: the top level
        """
        return sorted(self.__levels.values(), key=lambda level: level.resolution)[-1]

    @property
    def type(self) -> PyramidType:
        """Get the pyramid's type (RASTER or VECTOR) from its format

        Returns:
            PyramidType: RASTER or VECTOR
        """
        if self.__format == "TIFF_PBF_MVT":
            return PyramidType.VECTOR
        else:
            return PyramidType.RASTER

    def load_list(self) -> int:
        """Load the list content and cache it

        If the list is already loaded, nothing is done.

        Returns:
            int: number of slabs in the list
        """
        if self.__content["loaded"]:
            return self.__content["count"]

        for slab, infos in self.list_generator():
            self.__content["cache"][slab] = infos
            self.__content["count"] += 1

        self.__content["loaded"] = True

        return self.__content["count"]

    def list_generator(
        self, level_id: str = None
    ) -> Iterator[Tuple[Tuple[SlabType, str, int, int], Dict]]:
        """Get list content

        The list is copied to a temporary file, roots are read and information about each slab is yielded. If the list is already loaded, the cached content is yielded.

        Args:
            level_id (str): id of the level, to load only one level

        Examples:

            S3 stored descriptor

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")

                    for (slab_type, level, column, row), infos in pyramid.list_generator():
                        print(infos)

                except Exception as e:
                    print("Cannot load the pyramid from its descriptor and read the list")

        Yields:
            Iterator[Tuple[Tuple[SlabType,str,int,int], Dict]]: Slab indices and storage information

            Value example:

                (
                    (<SlabType.DATA: 'DATA'>, '18', 5424, 7526),
                    {
                        'link': False,
                        'md5': None,
                        'root': 'pyramids@localhost:9000/LIMADM',
                        'slab': 'DATA_18_5424_7526'
                    }
                )

        Raises:
            StorageError: Unhandled pyramid storage to copy list
            MissingEnvironmentError: Missing object storage information
        """
        if self.__content["loaded"]:
            for slab, infos in self.__content["cache"].items():
                if level_id is not None:
                    if slab[1] == level_id:
                        yield slab, infos
                else:
                    yield slab, infos
        else:
            # Copy the list to a temporary file (the list may be an object)
            list_obj = tempfile.NamedTemporaryFile(mode="r", delete=False)
            list_file = list_obj.name
            copy(self.__list, f"file://{list_file}")
            list_obj.close()

            roots = {}
            s3_cluster = self.storage_s3_cluster

            with open(list_file) as listin:
                # Read the roots
                for line in listin:
                    line = line.rstrip()

                    if line == "#":
                        break

                    root_id, root_path = line.split("=", 1)

                    if s3_cluster is None:
                        roots[root_id] = root_path
                    else:
                        # We have an S3 cluster name: append it to the bucket name in the roots
                        root_bucket, root_path = root_path.split("/", 1)
                        roots[root_id] = f"{root_bucket}@{s3_cluster}/{root_path}"

                # Read the slabs
                for line in listin:
                    line = line.rstrip()

                    parts = line.split(" ", 1)
                    slab_path = parts[0]
                    slab_md5 = None
                    if len(parts) == 2:
                        slab_md5 = parts[1]

                    root_id, slab_path = slab_path.split("/", 1)

                    slab_type, level, column, row = self.get_infos_from_slab_path(slab_path)
                    infos = {
                        "root": roots[root_id],
                        "link": root_id != "0",
                        "slab": slab_path,
                        "md5": slab_md5,
                    }

                    if level_id is not None:
                        if level == level_id:
                            yield ((slab_type, level, column, row), infos)
                    else:
                        yield ((slab_type, level, column, row), infos)

            remove(f"file://{list_file}")

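The `.list` layout read above (root lines `root_id=path`, a lone `#` separator, then `root_id/slab_path [md5]` entries) can be exercised on its own. `parse_list` below is a hypothetical standalone helper mirroring the generator's parsing logic; it is not part of the rok4 API.

```python
# Minimal sketch of the ".list" parsing done by list_generator, for illustration.
# parse_list is a hypothetical standalone helper, not part of the rok4 package.
from typing import Dict, Iterator, Tuple


def parse_list(lines) -> Iterator[Tuple[str, Dict]]:
    lines = iter(lines)
    roots = {}

    # Root section: "root_id=root_path" lines, terminated by a lone "#"
    for line in lines:
        line = line.rstrip()
        if line == "#":
            break
        root_id, root_path = line.split("=", 1)
        roots[root_id] = root_path

    # Slab section: "root_id/slab_path" optionally followed by a space and an MD5
    for line in lines:
        parts = line.rstrip().split(" ", 1)
        root_id, slab_path = parts[0].split("/", 1)
        yield slab_path, {
            "root": roots[root_id],
            "link": root_id != "0",  # root "0" is the pyramid itself
            "md5": parts[1] if len(parts) == 2 else None,
        }


content = [
    "0=pyramids/LIMADM",
    "#",
    "0/DATA_18_5424_7526 0123456789abcdef",
]
slabs = list(parse_list(content))
```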


    def get_level(self, level_id: str) -> "Level":
        """Get one level according to its identifier

        Args:
            level_id: Level identifier

        Returns:
            The corresponding pyramid's level, None if not present
        """

        return self.__levels.get(level_id, None)

    def get_levels(self, bottom_id: str = None, top_id: str = None) -> List[Level]:
        """Get sorted levels in the provided range, from bottom to top

        Args:
            bottom_id (str, optional): specific bottom level id. Defaults to None.
            top_id (str, optional): specific top level id. Defaults to None.

        Raises:
            Exception: Provided levels are not consistent (bottom > top or not in the pyramid)

        Examples:

            All levels

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
                    levels = pyramid.get_levels()

                except Exception as e:
                    print("Cannot load the pyramid from its descriptor and get levels")

            From pyramid's bottom to provided top (level 5)

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json")
                    levels = pyramid.get_levels(None, "5")

                except Exception as e:
                    print("Cannot load the pyramid from its descriptor and get levels")

        Returns:
            List[Level]: asked sorted levels
        """

        sorted_levels = sorted(self.__levels.values(), key=lambda level: level.resolution)

        levels = []

        begin = False
        if bottom_id is None:
            # No bottom level provided: start from the very bottom
            begin = True
        else:
            if self.get_level(bottom_id) is None:
                raise Exception(
                    f"Pyramid {self.name} does not contain the provided bottom level {bottom_id}"
                )

        if top_id is not None and self.get_level(top_id) is None:
            raise Exception(f"Pyramid {self.name} does not contain the provided top level {top_id}")

        end = False

        for level in sorted_levels:
            if not begin and level.id == bottom_id:
                begin = True

            if begin:
                levels.append(level)
                if top_id is not None and level.id == top_id:
                    end = True
                    break
            else:
                continue

        if top_id is None:
            # No top level provided: going all the way up is expected
            end = True

        if not begin or not end:
            raise Exception(
                f"Provided levels ids are not consistent to extract levels from the pyramid {self.name}"
            )

        return levels

    def write_descriptor(self) -> None:
        """Write the pyramid's descriptor to the final location (in the pyramid's storage root)"""

        content = json.dumps(self.serializable)
        put_data_str(content, self.__descriptor)

    def get_infos_from_slab_path(self, path: str) -> Tuple[SlabType, str, int, int]:
        """Get the slab's indices from its storage path

        Args:
            path (str): Slab's storage path

        Examples:

            FILE stored pyramid

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("/path/to/descriptor.json")
                    slab_type, level, column, row = self.get_infos_from_slab_path("DATA/12/00/4A/F7.tif")
                    # (SlabType.DATA, "12", 159, 367)
                except Exception as e:
                    print("Cannot load the pyramid from its descriptor and convert a slab path")

            S3 stored pyramid

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/pyramid.json")
                    slab_type, level, column, row = self.get_infos_from_slab_path("s3://bucket_name/path/to/pyramid/MASK_15_9164_5846")
                    # (SlabType.MASK, "15", 9164, 5846)
                except Exception as e:
                    print("Cannot load the pyramid from its descriptor and convert a slab path")

        Returns:
            Tuple[SlabType, str, int, int]: Slab's type (DATA or MASK), level identifier, slab's column and slab's row
        """
        if self.__storage["type"] == StorageType.FILE:
            parts = path.split("/")

            # The end of the path holds the slab's column and row, depending on the chosen depth
            # depth = 2 -> the last 3 parts are used for the conversion
            column, row = b36_path_decode("/".join(parts[-(self.__storage["depth"] + 1) :]))
            level = parts[-(self.__storage["depth"] + 2)]
            raw_slab_type = parts[-(self.__storage["depth"] + 3)]

            # Kept for backward compatibility with the old naming
            if raw_slab_type == "IMAGE":
                raw_slab_type = "DATA"

            slab_type = SlabType[raw_slab_type]

            return slab_type, level, column, row
        else:
            parts = re.split(r"[/_]", path)
            column = parts[-2]
            row = parts[-1]
            level = parts[-3]
            raw_slab_type = parts[-4]

            # Kept for backward compatibility with the old naming
            if raw_slab_type == "IMG":
                raw_slab_type = "DATA"
            elif raw_slab_type == "MSK":
                raw_slab_type = "MASK"

            slab_type = SlabType[raw_slab_type]

            return slab_type, level, int(column), int(row)

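For object storage, the decoding above boils down to splitting the slab name on `/` and `_` and reading the last four fields as type, level, column and row. `parse_object_slab_name` is a hypothetical standalone sketch of that branch (returning the type as a string rather than a `SlabType` member), not a rok4 function.

```python
# Standalone sketch of the object-storage branch of get_infos_from_slab_path.
# parse_object_slab_name is a hypothetical helper, not part of the rok4 API.
import re
from typing import Tuple


def parse_object_slab_name(path: str) -> Tuple[str, str, int, int]:
    parts = re.split(r"[/_]", path)
    raw_type, level, column, row = parts[-4], parts[-3], parts[-2], parts[-1]
    # Backward compatibility with the old naming
    raw_type = {"IMG": "DATA", "MSK": "MASK"}.get(raw_type, raw_type)
    return raw_type, level, int(column), int(row)


result = parse_object_slab_name("bucket/prefix/MASK_15_9164_5846")
```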


    def get_slab_path_from_infos(
        self, slab_type: SlabType, level: str, column: int, row: int, full: bool = True
    ) -> str:
        """Get slab's storage path from the indices

        Args:
            slab_type (SlabType): DATA or MASK
            level (str): Level identifier
            column (int): Slab's column
            row (int): Slab's row
            full (bool, optional): Full path or just relative path from pyramid storage root. Defaults to True.

        Returns:
            str: Absolute or relative slab's storage path
        """
        if self.__storage["type"] == StorageType.FILE:
            slab_path = os.path.join(
                slab_type.value, level, b36_path_encode(column, row, self.__storage["depth"])
            )
        else:
            slab_path = f"{slab_type.value}_{level}_{column}_{row}"

        if full:
            return get_path_from_infos(
                self.__storage["type"], self.__storage["root"], self.__name, slab_path
            )
        else:
            return slab_path

    def get_tile_data_binary(self, level: str, column: int, row: int) -> str:
        """Get a pyramid's tile as binary string

        To get a tile, 3 steps :
            * calculate the slab path from the tile index
            * read the slab index to get offsets and sizes of the slab's tiles
            * read the tile into the slab

        Args:
            level (str): Tile's level
            column (int): Tile's column
            row (int): Tile's row

        Limitations:
            Pyramids with one-tile slabs are not handled

        Examples:

            FILE stored raster pyramid, to extract a tile containing a point and save it as an independent image

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("/data/pyramids/SCAN1000.json")
                    level, col, row, pcol, prow = pyramid.get_tile_indices(992904.46, 6733643.15, "9", srs = "IGNF:LAMB93")
                    data = pyramid.get_tile_data_binary(level, col, row)

                    if data is None:
                        print("No data")
                    else:
                        tile_name = f"tile_{level}_{col}_{row}.{pyramid.tile_extension}"
                        with open(tile_name, "wb") as image:
                            image.write(data)
                        print(f"Tile written in {tile_name}")

                except Exception as e:
                    print(f"Cannot save a pyramid's tile : {e}")

        Raises:
            Exception: Level not found in the pyramid
            NotImplementedError: Pyramid owns one-tile slabs
            MissingEnvironmentError: Missing object storage information
            StorageError: Storage read issue

        Returns:
            str: data, as binary string, None if no data
        """

        level_object = self.get_level(level)

        if level_object is None:
            raise Exception(f"No level {level} in the pyramid")

        if level_object.slab_width == 1 and level_object.slab_height == 1:
            raise NotImplementedError("One-tile slab pyramid is not handled")

        if not level_object.is_in_limits(column, row):
            return None

        # Slab indices
        slab_column = column // level_object.slab_width
        slab_row = row // level_object.slab_height

        # Tile indices within the slab
        relative_tile_column = column % level_object.slab_width
        relative_tile_row = row % level_object.slab_height

        # Tile number in the header
        tile_index = relative_tile_row * level_object.slab_width + relative_tile_column

        # Compute the path of the slab containing the wanted tile
        slab_path = self.get_slab_path_from_infos(SlabType.DATA, level, slab_column, slab_row)

        # Read the tiles' offsets and sizes from the slab
        # A ROK4 slab has a fixed 2048-byte header,
        # then the offsets are stored (4 bytes each),
        # then the sizes (4 bytes each)
        try:
            binary_index = get_data_binary(
                slab_path,
                (
                    ROK4_IMAGE_HEADER_SIZE,
                    2 * 4 * level_object.slab_width * level_object.slab_height,
                ),
            )
        except FileNotFoundError:
            # A missing slab is simply handled as an absence of data
            return None

        offsets = numpy.frombuffer(
            binary_index,
            dtype=numpy.dtype("uint32"),
            count=level_object.slab_width * level_object.slab_height,
        )
        sizes = numpy.frombuffer(
            binary_index,
            dtype=numpy.dtype("uint32"),
            offset=4 * level_object.slab_width * level_object.slab_height,
            count=level_object.slab_width * level_object.slab_height,
        )

        if sizes[tile_index] == 0:
            return None

        return get_data_binary(slab_path, (offsets[tile_index], sizes[tile_index]))

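The slab/tile arithmetic and the index layout used above can be checked against a synthetic slab index; the slab dimensions, tile indices, offsets and sizes below are made up for the demonstration.

```python
# Sketch of the tile lookup arithmetic and index decoding done by
# get_tile_data_binary, on a synthetic slab index (demonstration values only).
import numpy as np

slab_width, slab_height = 4, 4  # tiles per slab, demonstration values
column, row = 9, 6  # absolute tile indices in the level

# Which slab, and which tile inside that slab
slab_column, slab_row = column // slab_width, row // slab_height
relative_col, relative_row = column % slab_width, row % slab_height
tile_index = relative_row * slab_width + relative_col

# Synthetic slab index: offsets (uint32 each) then sizes (uint32 each),
# as stored right after the fixed 2048-byte slab header
offsets = (np.arange(slab_width * slab_height) * 1000 + 2048).astype(np.uint32)
sizes = np.full(slab_width * slab_height, 1000, dtype=np.uint32)
binary_index = offsets.tobytes() + sizes.tobytes()

decoded_offsets = np.frombuffer(binary_index, dtype=np.uint32, count=slab_width * slab_height)
decoded_sizes = np.frombuffer(
    binary_index,
    dtype=np.uint32,
    offset=4 * slab_width * slab_height,
    count=slab_width * slab_height,
)
```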


    def get_tile_data_raster(self, level: str, column: int, row: int) -> numpy.ndarray:
        """Get a raster pyramid's tile as a 3-dimension numpy ndarray

        First dimension is the row, second one the column, third one the band.

        Args:
            level (str): Tile's level
            column (int): Tile's column
            row (int): Tile's row

        Limitations:
            Packbits (pyramid formats TIFF_PKB_FLOAT32 and TIFF_PKB_UINT8) and LZW (pyramid formats TIFF_LZW_FLOAT32 and TIFF_LZW_UINT8) compressions are not handled.

        Raises:
            Exception: Cannot get raster data for a vector pyramid
            Exception: Level not found in the pyramid
            NotImplementedError: Pyramid owns one-tile slabs
            NotImplementedError: Raster pyramid format not handled
            MissingEnvironmentError: Missing object storage information
            StorageError: Storage read issue
            FormatError: Cannot decode tile

        Examples:

            FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level

                from rok4.pyramid import Pyramid

                try:
                    pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json")
                    level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326")
                    data = pyramid.get_tile_data_raster(level, col, row)

                    if data is None:
                        print("No data")
                    else:
                        print(data[prow][pcol])

                except Exception as e:
                    print(f"Cannot get a pyramid's pixel value : {e}")

        Returns:
            numpy.ndarray: data, as a numpy array, None if no data
        """

        if self.type == PyramidType.VECTOR:
            raise Exception("Cannot get tile as raster data : it's a vector pyramid")

        binary_tile = self.get_tile_data_binary(level, column, row)

        if binary_tile is None:
            return None

        level_object = self.get_level(level)

        if self.__format == "TIFF_JPG_UINT8" or self.__format == "TIFF_JPG90_UINT8":
            try:
                img = Image.open(io.BytesIO(binary_tile))
            except Exception as e:
                raise FormatError("JPEG", "binary tile", e)

            data = numpy.asarray(img)
            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        elif self.__format == "TIFF_RAW_UINT8":
            data = numpy.frombuffer(binary_tile, dtype=numpy.dtype("uint8"))
            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        elif self.__format == "TIFF_PNG_UINT8":
            try:
                img = Image.open(io.BytesIO(binary_tile))
            except Exception as e:
                raise FormatError("PNG", "binary tile", e)

            data = numpy.asarray(img)
            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        elif self.__format == "TIFF_ZIP_UINT8":
            try:
                data = numpy.frombuffer(zlib.decompress(binary_tile), dtype=numpy.dtype("uint8"))
            except Exception as e:
                raise FormatError("ZIP", "binary tile", e)

            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        elif self.__format == "TIFF_ZIP_FLOAT32":
            try:
                data = numpy.frombuffer(zlib.decompress(binary_tile), dtype=numpy.dtype("float32"))
            except Exception as e:
                raise FormatError("ZIP", "binary tile", e)

            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        elif self.__format == "TIFF_RAW_FLOAT32":
            data = numpy.frombuffer(binary_tile, dtype=numpy.dtype("float32"))
            data.shape = (
                level_object.tile_matrix.tile_size[0],
                level_object.tile_matrix.tile_size[1],
                self.__raster_specifications["channels"],
            )

        else:
            raise NotImplementedError(f"Cannot get tile as raster data for format {self.__format}")

        return data

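The `TIFF_ZIP_FLOAT32` branch above amounts to a zlib decompression followed by a float32 reshape to (height, width, channels). This standalone sketch reproduces it on a synthetic 2x2 single-band tile (real ROK4 tiles are typically 256x256).

```python
# Sketch of the TIFF_ZIP_FLOAT32 decoding branch of get_tile_data_raster,
# on a synthetic 2x2 single-band tile (demonstration values only).
import zlib

import numpy as np

tile_size = (2, 2)  # (height, width); demonstration values
channels = 1
samples = np.array([1.5, 2.5, 3.5, 4.5], dtype=np.float32)
binary_tile = zlib.compress(samples.tobytes())

# Decompress, then reshape row-major to (height, width, channels)
data = np.frombuffer(zlib.decompress(binary_tile), dtype=np.dtype("float32"))
data = data.reshape(tile_size[0], tile_size[1], channels)
```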


    def get_tile_data_vector(self, level: str, column: int, row: int) -> Dict:
        """Get a vector pyramid's tile as a GeoJSON dictionary

        Args:
            level (str): Tile's level
            column (int): Tile's column
            row (int): Tile's row

        Raises:
            Exception: Cannot get vector data for a raster pyramid
            Exception: Level not found in the pyramid
            NotImplementedError: Pyramid owns one-tile slabs
            NotImplementedError: Vector pyramid format not handled
            MissingEnvironmentError: Missing object storage information
            StorageError: Storage read issue
            FormatError: Cannot decode tile

        Examples:

            S3 stored vector pyramid, to print a tile as GeoJSON

                from rok4.pyramid import Pyramid

                import json

                try:
                    pyramid = Pyramid.from_descriptor("s3://pyramids/vectors/BDTOPO.json")
                    level, col, row, pcol, prow = pyramid.get_tile_indices(40.325, 3.123, srs = "EPSG:4326")
                    data = pyramid.get_tile_data_vector(level, col, row)

                    if data is None:
                        print("No data")
                    else:
                        print(json.dumps(data))

                except Exception as e:
                    print(f"Cannot print a vector pyramid's tile as GeoJSON : {e}")

        Returns:
            Dict: data, as a GeoJSON dictionary. None if no data
        """

        if self.type == PyramidType.RASTER:
            raise Exception("Cannot get tile as vector data : it's a raster pyramid")

        binary_tile = self.get_tile_data_binary(level, column, row)

        if binary_tile is None:
            return None

        self.get_level(level)

        if self.__format == "TIFF_PBF_MVT":
            try:
                data = mapbox_vector_tile.decode(binary_tile)
            except Exception as e:
                raise FormatError("PBF (MVT)", "binary tile", e)
        else:
            raise NotImplementedError(f"Cannot get tile as vector data for format {self.__format}")

        return data

1348 def get_tile_indices( 

-

1349 self, x: float, y: float, level: str = None, **kwargs 

-

1350 ) -> Tuple[str, int, int, int, int]: 

-

1351 """Get pyramid's tile and pixel indices from point's coordinates 

-

1352 

-

1353 Used coordinates system have to be the pyramid one. If EPSG:4326, x is latitude and y longitude. 

-

1354 

-

1355 Args: 

-

1356 x (float): point's x 

-

1357 y (float): point's y 

-

1358 level (str, optional): Pyramid's level to take into account, the bottom one if None . Defaults to None. 

-

1359 **srs (string): spatial reference system of provided coordinates, with authority and code (same as the pyramid's one if not provided) 

-

1360 

-

1361 Raises: 

-

1362 Exception: Cannot find level to calculate indices 

-

1363 RuntimeError: Provided SRS is invalid for OSR 

-

1364 

-

1365 Examples: 

-

1366 

-

1367 FILE stored DTM (raster) pyramid, to get the altitude value at a point in the best level 

-

1368 

-

1369 from rok4.pyramid import Pyramid 

-

1370 

-

1371 try: 

-

1372 pyramid = Pyramid.from_descriptor("/data/pyramids/RGEALTI.json") 

-

1373 level, col, row, pcol, prow = pyramid.get_tile_indices(44, 5, srs = "EPSG:4326") 

-

1374 data = pyramid.get_tile_data_raster(level, col, row) 

-

1375 

-

1376 if data is None: 

-

1377 print("No data") 

-

1378 else: 

-

1379 print(data[prow][pcol]) 

-

1380 

-

1381 except Exception as e: 

-

1382 print("Cannot get a pyramid's pixel value : {e}") 

-

1383 

-

1384 Returns: 

-

1385 Tuple[str, int, int, int, int]: Level identifier, tile's column, tile's row, pixel's (in the tile) column, pixel's row 

-

1386 """ 

-

1387 

-

1388 level_object = self.bottom_level 

-

1389 if level is not None: 

-

1390 level_object = self.get_level(level) 

-

1391 

-

1392 if level_object is None: 

-

1393 raise Exception("Cannot find the level to calculate indices") 

-

1394 

-

1395 if ( 

-

1396 "srs" in kwargs 

-

1397 and kwargs["srs"] is not None 

-

1398 and kwargs["srs"].upper() != self.__tms.srs.upper() 

-

1399 ): 

-

1400 sr = srs_to_spatialreference(kwargs["srs"]) 

-

1401 x, y = reproject_point((x, y), sr, self.__tms.sr) 

-

1402 

-

1403 return (level_object.id,) + level_object.tile_matrix.point_to_indices(x, y) 

-

1404 

-

1405 def delete_level(self, level_id: str) -> None: 

-

1406 """Delete the given level in the description of the pyramid 

-

1407 

-

1408 Args: 

-

1409 level_id: Level identifier 

-

1410 

-

1411 Raises: 

-

1412 Exception: Cannot find level 

-

1413 """ 

-

1414 

-

1415 try: 

-

1416 del self.__levels[level_id] 

-

1417 except Exception: 

-

1418 raise Exception(f"The level {level_id} does not exist in the pyramid") 

-

1419 

-

1420 def add_level( 

-

1421 self, 

-

1422 level_id: str, 

-

1423 tiles_per_width: int, 

-

1424 tiles_per_height: int, 

-

1425 tile_limits: Dict[str, int], 

-

1426 ) -> None: 

-

1427 """Add a level in the description of the pyramid 

-

1428 

-

1429 Args: 

-

1430 level_id: Level identifier 

-

1431 tiles_per_width : Number of tiles in width per slab 

-

1432 tiles_per_height : Number of tiles in height per slab 

-

1433 tile_limits : Minimum and maximum tiles' columns and rows of pyramid's content 

-

1434 """ 

-

1435 

-

1436 data = { 

-

1437 "id": level_id, 

-

1438 "tile_limits": tile_limits, 

-

1439 "tiles_per_width": tiles_per_width, 

-

1440 "tiles_per_height": tiles_per_height, 

-

1441 "storage": {"type": self.storage_type.name}, 

-

1442 } 

-

1443 if self.own_masks: 

-

1444 data["storage"]["mask_prefix"] = True 

-

1445 if self.storage_type == StorageType.FILE: 

-

1446 data["storage"]["path_depth"] = self.storage_depth 

-

1447 

-

1448 lev = Level.from_descriptor(data, self) 

-

1449 

-

1450 if self.__tms.get_level(lev.id) is None: 

-

1451 raise Exception( 

-

1452 f"Pyramid {self.name} owns a level with the ID '{lev.id}', not defined in the TMS '{self.tms.name}'" 

-

1453 ) 

-

1454 else: 

-

1455 self.__levels[lev.id] = lev 

-

1456 

-

1457 @property 

-

1458 def size(self) -> int: 

-

1459 """Get the size of the pyramid 

-

1460 

-

1461 Examples: 

-

1462 

-

1463 from rok4.pyramid import Pyramid 

-

1464 

-

1465 try: 

-

1466 pyramid = Pyramid.from_descriptor("s3://bucket_name/path/to/descriptor.json") 

-

1467 size = pyramid.size 

-

1468 

-

1469 except Exception as e: 

-

1470 print("Cannot load the pyramid from its descriptor and get its size") 

-

1471 

-

1472 Returns: 

-

1473 int: size of the pyramid 

-

1474 """ 

-

1475 

-

1476 if not hasattr(self, "_Pyramid__size"): 

-

1477 self.__size = size_path( 

-

1478 get_path_from_infos(self.__storage["type"], self.__storage["root"], self.__name) 

-

1479 ) 

-

1480 

-

1481 return self.__size 

-
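The `size` property above computes the pyramid's size only once, guarding with `hasattr` before storing the result on the instance. A minimal standalone sketch of this memoization pattern (the `Dataset` class and `_compute_size` are hypothetical, not part of rok4):

```python
class Dataset:
    """Hypothetical class illustrating hasattr-guarded lazy computation."""

    @property
    def size(self) -> int:
        # Compute on first access only, then reuse the cached value
        if not hasattr(self, "_size"):
            self._size = self._compute_size()
        return self._size

    def _compute_size(self) -> int:
        # Stands in for an expensive storage walk such as size_path()
        return 42


d = Dataset()
print(d.size)  # computed on first access
print(d.size)  # served from the cached attribute
```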
- - - diff --git a/2.2.2/tests/z_4cdda0aa429327c0_raster_py.html b/2.2.2/tests/z_4cdda0aa429327c0_raster_py.html deleted file mode 100644 index 1e86522..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_raster_py.html +++ /dev/null @@ -1,456 +0,0 @@ - - - - - Coverage for src/rok4/raster.py: 97% - - - - - -
-
-

- Coverage for src/rok4/raster.py: - 97% -

- -

- 126 statements   - - - -

-

- coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1"""Provide functions to read information on raster data 

-

2 

-

3The module contains the following class : 

-

4 

-

5- `Raster` - Structure describing raster data. 

-

6- `RasterSet` - Structure describing a set of raster data. 

-

7""" 

-

8 

-

9# -- IMPORTS -- 

-

10 

-

11import json 

-

12import re 

-

13import tempfile 

-

14 

-

15# standard library 

-

16from copy import deepcopy 

-

17from json.decoder import JSONDecodeError 

-

18from typing import Dict, Tuple 

-

19 

-

20# 3rd party 

-

21from osgeo import gdal, ogr 

-

22 

-

23# package 

-

24from rok4.enums import ColorFormat 

-

25from rok4.storage import ( 

-

26 copy, 

-

27 exists, 

-

28 get_data_str, 

-

29 get_osgeo_path, 

-

30 put_data_str, 

-

31 remove, 

-

32) 

-

33from rok4.utils import compute_bbox, compute_format 

-

34 

-

35# -- GLOBALS -- 

-

36 

-

37# Enable GDAL/OGR exceptions 

-

38ogr.UseExceptions() 

-

39gdal.UseExceptions() 

-

40 

-

41 

-

42class Raster: 

-

43 """A structure describing raster data 

-

44 

-

45 Attributes: 

-

46 path (str): path to the file/object (ex: file:///path/to/image.tif or s3://bucket/path/to/image.tif) 

-

47 bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection 

-

48 bands (int): number of color bands (or channels) 

-

49 format (ColorFormat): numeric format for color values. Bit depth, as bits per channel, can be derived from it. 

-

50 mask (str): path to the associated mask file or object, if any, or None (same path as the image, but with a ".msk" extension and TIFF format. 

-

51 Ex: file:///path/to/image.msk or s3://bucket/path/to/image.msk) 

-

52 dimensions (Tuple[int, int]): image width and height, in pixels 

-

53 """ 

-

54 

-

55 def __init__(self) -> None: 

-

56 self.bands = None 

-

57 self.bbox = (None, None, None, None) 

-

58 self.dimensions = (None, None) 

-

59 self.format = None 

-

60 self.mask = None 

-

61 self.path = None 

-

62 

-

63 @classmethod 

-

64 def from_file(cls, path: str) -> "Raster": 

-

65 """Creates a Raster object from an image 

-

66 

-

67 Args: 

-

68 path (str): path to the image file/object 

-

69 

-

70 Examples: 

-

71 

-

72 Loading information from a file-stored raster TIFF image 

-

73 

-

74 from rok4.raster import Raster 

-

75 

-

76 try: 

-

77 raster = Raster.from_file( 

-

78 "file:///data/SC1000/0040_6150_L93.tif" 

-

79 ) 

-

80 

-

81 except Exception as e: 

-

82 print(f"Cannot load information from image : {e}") 

-

83 

-

84 Raises: 

-

85 FormatError: MASK file is not a TIFF 

-

86 RuntimeError: raised by OGR/GDAL if anything goes wrong 

-

87 NotImplementedError: Storage type not handled 

-

88 FileNotFoundError: File or object does not exist 

-

89 

-

90 Returns: 

-

91 Raster: a Raster instance 

-

92 """ 

-

93 if not exists(path): 

-

94 raise FileNotFoundError(f"No file or object found at path '{path}'.") 

-

95 

-

96 self = cls() 

-

97 

-

98 work_image_path = get_osgeo_path(path) 

-

99 

-

100 image_datasource = gdal.Open(work_image_path) 

-

101 self.path = path 

-

102 

-

103 path_pattern = re.compile("(/[^/]+?)[.][a-zA-Z0-9_-]+$") 

-

104 mask_path = path_pattern.sub("\\1.msk", path) 

-

105 

-

106 if exists(mask_path): 

-

107 work_mask_path = get_osgeo_path(mask_path) 

-

108 mask_driver = gdal.IdentifyDriver(work_mask_path).ShortName 

-

109 if "GTiff" != mask_driver: 

-

110 message = f"Mask file '{mask_path}' use GDAL driver : '{mask_driver}'" 

-

111 raise FormatError("TIFF", mask_path, message) 

-

112 self.mask = mask_path 

-

113 else: 

-

114 self.mask = None 

-

115 

-

116 self.bbox = compute_bbox(image_datasource) 

-

117 self.bands = image_datasource.RasterCount 

-

118 self.format = compute_format(image_datasource, path) 

-

119 self.dimensions = (image_datasource.RasterXSize, image_datasource.RasterYSize) 

-

120 

-

121 return self 

-

122 

-

123 @classmethod 

-

124 def from_parameters( 

-

125 cls, 

-

126 path: str, 

-

127 bands: int, 

-

128 bbox: Tuple[float, float, float, float], 

-

129 dimensions: Tuple[int, int], 

-

130 format: ColorFormat, 

-

131 mask: str = None, 

-

132 ) -> "Raster": 

-

133 """Creates a Raster object from parameters 

-

134 

-

135 Args: 

-

136 path (str): path to the file/object (ex: file:///path/to/image.tif or s3://bucket/image.tif) 

-

137 bands (int): number of color bands (or channels) 

-

138 bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection 

-

139 dimensions (Tuple[int, int]): image width and height expressed in pixels 

-

140 format (ColorFormat): numeric format for color values. Bit depth, as bits per channel, can be derived from it. 

-

141 mask (str, optional): path to the associated mask, if any, or None (same path as the image, but with a ".msk" 

-

142 extension and TIFF format. ex: file:///path/to/image.msk or s3://bucket/image.msk) 

-

143 

-

144 Examples: 

-

145 

-

146 Loading information from parameters, related to 

-

147 a TIFF main image coupled to a TIFF mask image 

-

148 

-

149 from rok4.raster import Raster 

-

150 

-

151 try: 

-

152 raster = Raster.from_parameters( 

-

153 path="file:///data/SC1000/_0040_6150_L93.tif", 

-

154 mask="file:///data/SC1000/0040_6150_L93.msk", 

-

155 bands=3, 

-

156 format=ColorFormat.UINT8, 

-

157 dimensions=(2000, 2000), 

-

158 bbox=(40000.000, 5950000.000, 240000.000, 6150000.000) 

-

159 ) 

-

160 

-

161 except Exception as e: 

-

162 print( 

-

163 f"Cannot load information from parameters : {e}" 

-

164 ) 

-

165 

-

166 Raises: 

-

167 KeyError: a mandatory argument is missing 

-

168 

-

169 Returns: 

-

170 Raster: a Raster instance 

-

171 """ 

-

172 self = cls() 

-

173 

-

174 self.path = path 

-

175 self.bands = bands 

-

176 self.bbox = bbox 

-

177 self.dimensions = dimensions 

-

178 self.format = format 

-

179 self.mask = mask 

-

180 return self 

-

181 

-

182 

-

183class RasterSet: 

-

184 """A structure describing a set of raster data 

-

185 

-

186 Attributes: 

-

187 raster_list (List[Raster]): List of Raster instances in the set 

-

188 colors (Set[Tuple[int, ColorFormat]]): Set (distinct values) of color properties (bands and format) found in the raster set. 

-

189 srs (str): Name of the set's spatial reference system 

-

190 bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection, enclosing the whole set 

-

191 """ 

-

192 

-

193 def __init__(self) -> None: 

-

194 self.bbox = (None, None, None, None) 

-

195 self.colors = set() 

-

196 self.raster_list = [] 

-

197 self.srs = None 

-

198 

-

199 @classmethod 

-

200 def from_list(cls, path: str, srs: str) -> "RasterSet": 

-

201 """Instanciate a RasterSet from an images list path and a srs 

-

202 

-

203 Args: 

-

204 path (str): path to the images list file or object (each line in this list contains the path to an image file or object in the set) 

-

205 srs (str): images' coordinates system 

-

206 

-

207 Examples: 

-

208 

-

209 Loading information from a file-stored list 

-

210 

-

211 from rok4.raster import RasterSet 

-

212 

-

213 try: 

-

214 raster_set = RasterSet.from_list( 

-

215 path="file:///data/SC1000.list", 

-

216 srs="EPSG:3857" 

-

217 ) 

-

218 

-

219 except Exception as e: 

-

220 print( 

-

221 f"Cannot load information from list file : {e}" 

-

222 ) 

-

223 

-

224 Raises: 

-

225 RuntimeError: raised by OGR/GDAL if anything goes wrong 

-

226 NotImplementedError: Storage type not handled 

-

227 

-

228 Returns: 

-

229 RasterSet: a RasterSet instance 

-

230 """ 

-

231 self = cls() 

-

232 self.srs = srs 

-

233 

-

234 # Load the image list (the list can be a file or an object) 

-

235 list_obj = tempfile.NamedTemporaryFile(mode="r", delete=False) 

-

236 list_file = list_obj.name 

-

237 copy(path, f"file://{list_file}") 

-

238 list_obj.close() 

-

239 image_list = [] 

-

240 with open(list_file) as listin: 

-

241 for line in listin: 

-

242 image_path = line.strip(" \t\n\r") 

-

243 image_list.append(image_path) 

-

244 

-

245 remove(f"file://{list_file}") 

-

246 

-

247 bbox = [None, None, None, None] 

-

248 for image_path in image_list: 

-

249 raster = Raster.from_file(image_path) 

-

250 self.raster_list.append(raster) 

-

251 

-

252 # Update the global bbox 

-

253 if bbox == [None, None, None, None]: 

-

254 bbox = list(raster.bbox) 

-

255 else: 

-

256 if bbox[0] > raster.bbox[0]: 

-

257 bbox[0] = raster.bbox[0] 

-

258 if bbox[1] > raster.bbox[1]: 

-

259 bbox[1] = raster.bbox[1] 

-

260 if bbox[2] < raster.bbox[2]: 

-

261 bbox[2] = raster.bbox[2] 

-

262 if bbox[3] < raster.bbox[3]: 

-

263 bbox[3] = raster.bbox[3] 

-

264 

-

265 # Inventory of distinct colors 

-

266 self.colors.add((raster.bands, raster.format)) 

-

267 

-

268 self.bbox = tuple(bbox) 

-

269 

-

270 return self 

-
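The loop above folds each raster's bbox into a global bounding box by taking per-coordinate minima and maxima. A standalone sketch of that union, assuming the (xmin, ymin, xmax, ymax) ordering used by rok4 (`union_bbox` is an illustrative helper, not the library's API):

```python
from typing import List, Tuple

BBox = Tuple[float, float, float, float]  # (xmin, ymin, xmax, ymax)

def union_bbox(bboxes: List[BBox]) -> BBox:
    """Smallest bounding box enclosing all input bounding boxes."""
    xmin = min(b[0] for b in bboxes)
    ymin = min(b[1] for b in bboxes)
    xmax = max(b[2] for b in bboxes)
    ymax = max(b[3] for b in bboxes)
    return (xmin, ymin, xmax, ymax)

print(union_bbox([(0, 0, 10, 10), (5, -5, 20, 8)]))  # (0, -5, 20, 10)
```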

271 

-

272 @classmethod 

-

273 def from_descriptor(cls, path: str) -> "RasterSet": 

-

274 """Creates a RasterSet object from a descriptor file or object 

-

275 

-

276 Args: 

-

277 path (str): path to the descriptor file or object 

-

278 

-

279 Examples: 

-

280 

-

281 Loading information from a file-stored descriptor 

-

282 

-

283 from rok4.raster import RasterSet 

-

284 

-

285 try: 

-

286 raster_set = RasterSet.from_descriptor( 

-

287 "file:///data/images/descriptor.json" 

-

288 ) 

-

289 

-

290 except Exception as e: 

-

291 message = f"Cannot load information from descriptor file: {e}" 

-

292 print(message) 

-

293 

-

294 Raises: 

-

295 RuntimeError: raised by OGR/GDAL if anything goes wrong 

-

296 NotImplementedError: Storage type not handled 

-

297 

-

298 Returns: 

-

299 RasterSet: a RasterSet instance 

-

300 """ 

-

301 self = cls() 

-

302 

-

303 try: 

-

304 serialization = json.loads(get_data_str(path)) 

-

305 

-

306 except JSONDecodeError as e: 

-

307 raise FormatError("JSON", path, e) 

-

308 

-

309 self.srs = serialization["srs"] 

-

310 self.raster_list = [] 

-

311 for raster_dict in serialization["raster_list"]: 

-

312 parameters = deepcopy(raster_dict) 

-

313 parameters["bbox"] = tuple(raster_dict["bbox"]) 

-

314 parameters["dimensions"] = tuple(raster_dict["dimensions"]) 

-

315 parameters["format"] = ColorFormat[raster_dict["format"]] 

-

316 self.raster_list.append(Raster.from_parameters(**parameters)) 

-

317 

-

318 self.bbox = tuple(serialization["bbox"]) 

-

319 for color_dict in serialization["colors"]: 

-

320 self.colors.add((color_dict["bands"], ColorFormat[color_dict["format"]])) 

-

321 

-

322 return self 

-

323 

-

324 @property 

-

325 def serializable(self) -> Dict: 

-

326 """Get the dict version of the raster set, descriptor compliant 

-

327 

-

328 Returns: 

-

329 Dict: descriptor structured object description 

-

330 """ 

-

331 serialization = {"bbox": list(self.bbox), "srs": self.srs, "colors": [], "raster_list": []} 

-

332 for color in self.colors: 

-

333 color_serial = {"bands": color[0], "format": color[1].name} 

-

334 serialization["colors"].append(color_serial) 

-

335 for raster in self.raster_list: 

-

336 raster_dict = { 

-

337 "path": raster.path, 

-

338 "dimensions": list(raster.dimensions), 

-

339 "bbox": list(raster.bbox), 

-

340 "bands": raster.bands, 

-

341 "format": raster.format.name, 

-

342 } 

-

343 if raster.mask is not None: 

-

344 raster_dict["mask"] = raster.mask 

-

345 serialization["raster_list"].append(raster_dict) 

-

346 

-

347 return serialization 

-

348 

-

349 def write_descriptor(self, path: str = None) -> None: 

-

350 """Print raster set's descriptor as JSON 

-

351 

-

352 Args: 

-

353 path (str, optional): Complete path (file or object) where to print the raster set's JSON. Defaults to None, JSON is printed to standard output. 

-

354 """ 

-

355 content = json.dumps(self.serializable, sort_keys=True) 

-

356 if path is None: 

-

357 print(content) 

-

358 else: 

-

359 put_data_str(content, path) 

-
- - - diff --git a/2.2.2/tests/z_4cdda0aa429327c0_storage_py.html b/2.2.2/tests/z_4cdda0aa429327c0_storage_py.html deleted file mode 100644 index 72538bb..0000000 --- a/2.2.2/tests/z_4cdda0aa429327c0_storage_py.html +++ /dev/null @@ -1,1256 +0,0 @@ - - - - - Coverage for src/rok4/storage.py: 80% - - - - - -
-
-

- Coverage for src/rok4/storage.py: - 80% -

- -

- 538 statements   - - - -

-

- coverage.py v7.6.1, - created at 2024-10-01 15:08 +0000 -

- -
-
-
-

1"""Provide functions to read or write data 

-

2 

-

3Available storage types are : 

-

4 

-

5- S3 (paths are prefixed with `s3://`) 

-

6- CEPH (paths are prefixed with `ceph://`) 

-

7- FILE (paths are prefixed with `file://`, which is also the default interpretation of unprefixed paths) 

-

8- HTTP (paths are prefixed with `http://`) 

-

9- HTTPS (paths are prefixed with `https://`) 

-

10 

-

11Not all storage types are necessarily available for every function. 

-

12 

-

13Readings use an LRU cache with a TTL. It can be configured with environment variables : 

-

14 

-

15- ROK4_READING_LRU_CACHE_SIZE : Number of cached elements. Default 64. Set 0 or a negative integer to configure a cache without bound. A power of two makes the cache more efficient. 

-

16- ROK4_READING_LRU_CACHE_TTL : Validity duration of cached elements, in seconds. Default 300. Set 0 or a negative integer to get a cache without expiration. 

-

17 

-

18To disable cache (always read data on storage), set ROK4_READING_LRU_CACHE_SIZE to 1 and ROK4_READING_LRU_CACHE_TTL to 1. 

-

19 

-

20Using CEPH storage requires environment variables : 

-

21 

-

22- ROK4_CEPH_CONFFILE 

-

23- ROK4_CEPH_USERNAME 

-

24- ROK4_CEPH_CLUSTERNAME 

-

25 

-

26Using S3 storage requires environment variables : 

-

27 

-

28- ROK4_S3_KEY 

-

29- ROK4_S3_SECRETKEY 

-

30- ROK4_S3_URL 

-

31 

-

32To use several S3 clusters, each environment variable has to contain a comma-separated list, with the same number of elements in each 

-

33 

-

34Example, work with 2 S3 clusters: 

-

35 

-

36- ROK4_S3_KEY=KEY1,KEY2 

-

37- ROK4_S3_SECRETKEY=SKEY1,SKEY2 

-

38- ROK4_S3_URL=https://s3.storage.fr,https://s4.storage.fr 

-

39 

-

40To specify the cluster to use, the bucket name should be bucket_name@s3.storage.fr or bucket_name@s4.storage.fr. If no host is defined (no @) in the bucket name, the first S3 cluster is used 

-

41""" 

-
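The multi-cluster convention described above can be sketched independently of the rok4 code; `resolve_s3_cluster` is a hypothetical helper mirroring the `bucket@host` parsing, not the library's API:

```python
from typing import List, Tuple

def resolve_s3_cluster(bucket: str, urls: List[str]) -> Tuple[str, str]:
    """Resolve <bucket>@<host> against configured cluster hosts (first one is the default)."""
    # Hosts are the URLs without their protocol, as in the convention above
    hosts = [u.replace("https://", "").replace("http://", "") for u in urls]
    name, _, host = bucket.partition("@")
    if host == "":
        host = hosts[0]  # no @ in the bucket name: first cluster is used
    if host not in hosts:
        raise ValueError(f"Unknown S3 cluster host '{host}'")
    return name, host

# urls as they would come from a comma-separated ROK4_S3_URL value
urls = "https://s3.storage.fr,https://s4.storage.fr".split(",")
print(resolve_s3_cluster("pyramids", urls))
print(resolve_s3_cluster("pyramids@s4.storage.fr", urls))
```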

42 

-

43import hashlib 

-

44import os 

-

45import re 

-

46import tempfile 

-

47import time 

-

48from functools import lru_cache 

-

49from shutil import copyfile 

-

50from typing import Dict, Tuple, Union 

-

51 

-

52import boto3 

-

53import botocore.exceptions 

-

54import requests 

-

55 

-

56 

-

57 

-

58# conditional import 

-

59 

-

60try: 

-

61 from osgeo import gdal 

-

62 

-

63 # Enable GDAL/OGR exceptions 

-

64 gdal.UseExceptions() 

-

65 

-

66 GDAL_AVAILABLE: bool = True 

-

67except ImportError: 

-

68 GDAL_AVAILABLE: bool = False 

-

69 gdal = None 

-

70 

-

71 

-

72try: 

-

73 import rados 

-

74 

-

75 CEPH_RADOS_AVAILABLE: bool = True 

-

76except ImportError: 

-

77 CEPH_RADOS_AVAILABLE: bool = False 

-

78 rados = None 

-

79 

-

80# package 

-

81from rok4.enums import StorageType 

-

82from rok4.exceptions import MissingEnvironmentError, StorageError 

-

83 

-

84# -- GLOBALS -- 

-

85 

-

86 

-

87__CEPH_CLIENT = None 

-

88__CEPH_IOCTXS = {} 

-

89__OBJECT_SYMLINK_SIGNATURE = "SYMLINK#" 

-

90__S3_CLIENTS = {} 

-

91__S3_DEFAULT_CLIENT = None 

-

92__LRU_SIZE = 64 

-

93__LRU_TTL = 300 

-

94 

-

95try: 

-

96 __LRU_SIZE = int(os.environ["ROK4_READING_LRU_CACHE_SIZE"]) 

-

97 if __LRU_SIZE < 1: 

-

98 __LRU_SIZE = None 

-

99except ValueError: 

-

100 pass 

-

101except KeyError: 

-

102 pass 

-

103 

-

104try: 

-

105 __LRU_TTL = int(os.environ["ROK4_READING_LRU_CACHE_TTL"]) 

-

106 if __LRU_TTL < 0: 

-

107 __LRU_TTL = 0 

-

108except ValueError: 

-

109 pass 

-

110except KeyError: 

-

111 pass 

-

112 

-

113 

-

114def __get_ttl_hash() -> int: 

-

115 """Return the time string rounded according to time-to-live value""" 

-

116 if __LRU_TTL == 0: 

-

117 return 0 

-

118 else: 

-

119 return round(time.time() / __LRU_TTL) 

-
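`__get_ttl_hash` is what makes `functools.lru_cache` expire entries: the rounded time value stays constant for up to one TTL, then changes, so cached calls keyed on it become misses. A minimal sketch of the trick (names are illustrative, not the library's):

```python
import time
from functools import lru_cache

TTL = 300  # seconds, mirroring the default ROK4_READING_LRU_CACHE_TTL

def ttl_hash(ttl: int = TTL) -> int:
    """Constant for up to ttl seconds, so cache keys roll over periodically."""
    return round(time.time() / ttl)

@lru_cache(maxsize=64)
def read_resource(path: str, time_bucket: int) -> str:
    # time_bucket is never used by the body: it only forces cache invalidation
    return f"content of {path}"

first = read_resource("s3://bucket/object", ttl_hash())
second = read_resource("s3://bucket/object", ttl_hash())  # usually a cache hit within the TTL
```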

120 

-

121 

-

122def __get_s3_client(bucket_name: str) -> Tuple[Dict[str, Union["boto3.client", str]], str]: 

-

123 """Get the S3 client 

-

124 

-

125 Create it if not already done 

-

126 

-

127 Args: 

-

128 bucket_name (str): S3 bucket name. Could be just the bucket name, or <bucket name>@<cluster host> 

-

129 

-

130 Raises: 

-

131 MissingEnvironmentError: Missing S3 storage information 

-

132 StorageError: S3 client configuration issue 

-

133 

-

134 Returns: 

-

135 Tuple[Dict[str, Union['boto3.client',str]], str]: the S3 information (client, host, key, secret) and the simple bucket name 

-

136 """ 

-

137 

-

138 global __S3_CLIENTS, __S3_DEFAULT_CLIENT 

-

139 

-

140 if not __S3_CLIENTS: 

-

141 verify = True 

-

142 if "ROK4_SSL_NO_VERIFY" in os.environ and os.environ["ROK4_SSL_NO_VERIFY"] != "": 

-

143 verify = False 

-

144 # First time the S3 storage is used: load the configuration from environment variables 

-

145 try: 

-

146 keys = os.environ["ROK4_S3_KEY"].split(",") 

-

147 secret_keys = os.environ["ROK4_S3_SECRETKEY"].split(",") 

-

148 urls = os.environ["ROK4_S3_URL"].split(",") 

-

149 

-

150 if len(keys) != len(secret_keys) or len(keys) != len(urls): 

-

151 raise StorageError( 

-

152 "S3", 

-

153 "S3 informations in environment variables are inconsistent : same number of element in each list is required", 

-

154 ) 

-

155 

-

156 for i in range(len(keys)): 

-

157 h = re.sub("https?://", "", urls[i]) 

-

158 

-

159 if h in __S3_CLIENTS: 

-

160 raise StorageError("S3", "A S3 cluster is defined twice (based on URL)") 

-

161 

-

162 __S3_CLIENTS[h] = { 

-

163 "client": boto3.client( 

-

164 "s3", 

-

165 aws_access_key_id=keys[i], 

-

166 aws_secret_access_key=secret_keys[i], 

-

167 verify=verify, 

-

168 endpoint_url=urls[i], 

-

169 config=botocore.config.Config(tcp_keepalive=True, max_pool_connections=10), 

-

170 ), 

-

171 "key": keys[i], 

-

172 "secret_key": secret_keys[i], 

-

173 "url": urls[i], 

-

174 "host": h, 

-

175 "secure": urls[i].startswith("https://"), 

-

176 } 

-

177 

-

178 if i == 0: 

-

179 # The first cluster is the default one 

-

180 __S3_DEFAULT_CLIENT = h 

-

181 

-

182 except KeyError as e: 

-

183 raise MissingEnvironmentError(e) 

-

184 except Exception as e: 

-

185 raise StorageError("S3", e) 

-

186 

-

187 try: 

-

188 host = bucket_name.split("@")[1] 

-

189 except IndexError: 

-

190 host = __S3_DEFAULT_CLIENT 

-

191 

-

192 bucket_name = bucket_name.split("@")[0] 

-

193 

-

194 if host not in __S3_CLIENTS: 

-

195 raise StorageError("S3", f"Unknown S3 cluster, according to host '{host}'") 

-

196 

-

197 return __S3_CLIENTS[host], bucket_name 

-

198 

-

199 

-

200def disconnect_s3_clients() -> None: 

-

201 """Clean S3 clients""" 

-

202 

-

203 global __S3_CLIENTS, __S3_DEFAULT_CLIENT 

-

204 __S3_CLIENTS = {} 

-

205 __S3_DEFAULT_CLIENT = None 

-

206 

-

207 

-

208def __get_ceph_ioctx(pool: str) -> "rados.Ioctx": 

-

209 """Get the CEPH IO context 

-

210 

-

211 Create it (client and context) if not already done 

-

212 

-

213 Args: 

-

214 pool (str): CEPH pool's name 

-

215 

-

216 Raises: 

-

217 MissingEnvironmentError: Missing CEPH storage information 

-

218 StorageError: CEPH IO context configuration issue 

-

219 

-

220 Returns: 

-

221 rados.Ioctx: IO ceph context 

-

222 """ 

-

223 global __CEPH_CLIENT, __CEPH_IOCTXS 

-

224 

-

225 if __CEPH_CLIENT is None: 

-

226 try: 

-

227 __CEPH_CLIENT = rados.Rados( 

-

228 conffile=os.environ["ROK4_CEPH_CONFFILE"], 

-

229 clustername=os.environ["ROK4_CEPH_CLUSTERNAME"], 

-

230 name=os.environ["ROK4_CEPH_USERNAME"], 

-

231 ) 

-

232 

-

233 __CEPH_CLIENT.connect() 

-

234 

-

235 except KeyError as e: 

-

236 raise MissingEnvironmentError(e) 

-

237 except Exception as e: 

-

238 raise StorageError("CEPH", e) 

-

239 

-

240 if pool not in __CEPH_IOCTXS: 

-

241 try: 

-

242 __CEPH_IOCTXS[pool] = __CEPH_CLIENT.open_ioctx(pool) 

-

243 except Exception as e: 

-

244 raise StorageError("CEPH", e) 

-

245 

-

246 return __CEPH_IOCTXS[pool] 

-

247 

-

248 

-

249def disconnect_ceph_clients() -> None: 

-

250 """Clean CEPH clients""" 

-

251 global __CEPH_CLIENT, __CEPH_IOCTXS 

-

252 __CEPH_CLIENT = None 

-

253 __CEPH_IOCTXS = {} 

-

254 

-

255 

-

256def get_infos_from_path(path: str) -> Tuple[StorageType, str, str, str]: 

-

257 """Extract storage type, the unprefixed path, the container and the basename from path (Default: FILE storage) 

-

258 

-

259 For a FILE storage, the tray is the directory and the basename is the file name. 

-

260 

-

261 For an object storage (CEPH or S3), the tray is the bucket or the pool and the basename is the object name. 

-

262 For a S3 bucket, format can be <bucket name>@<cluster name> to use several clusters. Cluster name is the host (without protocol) 

-

263 

-

264 Args: 

-

265 path (str): path to analyse 

-

266 

-

267 Returns: 

-

268 Tuple[StorageType, str, str, str]: storage type, unprefixed path, the container and the basename 

-

269 """ 

-

270 

-

271 if path.startswith("s3://"): 

-

272 bucket_name, object_name = path[5:].split("/", 1) 

-

273 return StorageType.S3, path[5:], bucket_name, object_name 

-

274 elif path.startswith("ceph://"): 

-

275 pool_name, object_name = path[7:].split("/", 1) 

-

276 return StorageType.CEPH, path[7:], pool_name, object_name 

-

277 elif path.startswith("file://"): 

-

278 return StorageType.FILE, path[7:], os.path.dirname(path[7:]), os.path.basename(path[7:]) 

-

279 elif path.startswith("http://"): 

-

280 return StorageType.HTTP, path[7:], os.path.dirname(path[7:]), os.path.basename(path[7:]) 

-

281 elif path.startswith("https://"): 

-

282 return StorageType.HTTPS, path[8:], os.path.dirname(path[8:]), os.path.basename(path[8:]) 

-

283 else: 

-

284 return StorageType.FILE, path, os.path.dirname(path), os.path.basename(path) 

-
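The prefix handling of `get_infos_from_path` can be summarised with a standalone sketch (`split_storage_path` is illustrative, not the library function, and storage types are plain strings here instead of the `StorageType` enum):

```python
import os
from typing import Tuple

# Prefix -> storage type, following the convention described in the module docstring
PREFIXES = {"s3://": "S3", "ceph://": "CEPH", "file://": "FILE",
            "http://": "HTTP", "https://": "HTTPS"}

def split_storage_path(path: str) -> Tuple[str, str, str, str]:
    """Return (storage type, unprefixed path, tray, basename)."""
    for prefix, storage in PREFIXES.items():
        if path.startswith(prefix):
            rest = path[len(prefix):]
            if storage in ("S3", "CEPH"):
                tray, base = rest.split("/", 1)  # bucket/pool, then object name
            else:
                tray, base = os.path.dirname(rest), os.path.basename(rest)
            return storage, rest, tray, base
    # No prefix: FILE is the default interpretation
    return "FILE", path, os.path.dirname(path), os.path.basename(path)

print(split_storage_path("s3://my_bucket/path/to/obj.json"))
```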

285 

-

286 

-

287def get_path_from_infos(storage_type: StorageType, *args) -> str: 

-

288 """Write full path from elements 

-

289 

-

290 Prefixed with the storage's type, elements are joined with a slash 

-

291 

-

292 Args: 

-

293 storage_type (StorageType): Storage's type for path 

-

294 

-

295 Returns: 

-

296 str: Full path 

-

297 """ 

-

298 return f"{storage_type.value}{os.path.join(*args)}" 

-

299 

-

300 

-

301def hash_file(path: str) -> str: 

-

302 """Process MD5 sum of the provided file 

-

303 

-

304 Args: 

-

305 path (str): path to file 

-

306 

-

307 Returns: 

-

308 str: hexadecimal MD5 sum 

-

309 """ 

-

310 

-

311 checker = hashlib.md5() 

-

312 

-

313 with open(path, "rb") as file: 

-

314 chunk = 0 

-

315 while chunk != b"": 

-

316 chunk = file.read(65536) 

-

317 checker.update(chunk) 

-

318 

-

319 return checker.hexdigest() 

-
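`hash_file` reads in 64 KiB chunks so memory stays bounded whatever the file size. The same idea, sketched standalone with an explicit loop exit (`md5_of_file` is an illustrative name, not the library function):

```python
import hashlib
import tempfile

def md5_of_file(path: str, chunk_size: int = 65536) -> str:
    """Chunked MD5: never holds more than chunk_size bytes in memory."""
    checker = hashlib.md5()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:  # empty bytes means end of file
                break
            checker.update(chunk)
    return checker.hexdigest()

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
    name = tmp.name
print(md5_of_file(name))  # md5 of b"hello"
```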

320 

-

321 

-

322def get_data_str(path: str) -> str: 

-

323 """Load full data into a string 

-

324 

-

325 Args: 

-

326 path (str): path to data 

-

327 

-

328 Raises: 

-

329 MissingEnvironmentError: Missing object storage information 

-

330 StorageError: Storage read issue 

-

331 FileNotFoundError: File or object does not exist 

-

332 NotImplementedError: Storage type not handled 

-

333 

-

334 Returns: 

-

335 str: Data content 

-

336 """ 

-

337 

-

338 return get_data_binary(path).decode("utf-8") 

-

339 

-

340 

-

341@lru_cache(maxsize=__LRU_SIZE) 

-

342def __get_cached_data_binary(path: str, ttl_hash: int, range: Tuple[int, int] = None) -> bytes: 

-

343 """Load data into a binary string, using a LRU cache 

-

344 

-

345 Args: 

-

346 path (str): path to data 

-

347 ttl_hash (int): time hash, to invalid cache 

-

348 range (Tuple[int, int], optional): offset and size, to make a partial read. Defaults to None. 

-

349 

-

350 Raises: 

-

351 MissingEnvironmentError: Missing object storage information 

-

352 StorageError: Storage read issue 

-

353 FileNotFoundError: File or object does not exist 

-

354 NotImplementedError: Storage type not handled 

-

355 

-

356 Returns: 

-

357 bytes: Data binary content 

-

358 """ 

-

359 storage_type, path, tray_name, base_name = get_infos_from_path(path) 

-

360 

-

361 if storage_type == StorageType.S3: 

-

362 s3_client, bucket_name = __get_s3_client(tray_name) 

-

363 

-

364 try: 

-

365 if range is None: 

-

366 data = ( 

-

367 s3_client["client"] 

-

368 .get_object( 

-

369 Bucket=bucket_name, 

-

370 Key=base_name, 

-

371 )["Body"] 

-

372 .read() 

-

373 ) 

-

374 else: 

-

375 data = ( 

-

376 s3_client["client"] 

-

377 .get_object( 

-

378 Bucket=bucket_name, 

-

379 Key=base_name, 

-

380 Range=f"bytes={range[0]}-{range[0] + range[1] - 1}", 

-

381 )["Body"] 

-

382 .read() 

-

383 ) 

-

384 

-

385 except botocore.exceptions.ClientError as e: 

-

386 if e.response["Error"]["Code"] == "NoSuchKey": 

-

387 raise FileNotFoundError(f"{storage_type.value}{path}") 

-

388 else: 

-

389 raise StorageError("S3", e) 

-

390 

-

391 except Exception as e: 

-

392 raise StorageError("S3", e) 

-

393 

-

394 elif storage_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE: 

-

395 ioctx = __get_ceph_ioctx(tray_name) 

-

396 

-

397 try: 

-

398 if range is None: 

-

399 size, mtime = ioctx.stat(base_name) 

-

400 data = ioctx.read(base_name, size) 

-

401 else: 

-

402 data = ioctx.read(base_name, range[1], range[0]) 

-

403 

-

404 except rados.ObjectNotFound: 

-

405 raise FileNotFoundError(f"{storage_type.value}{path}") 

-

406 

-

407 except Exception as e: 

-

408 raise StorageError("CEPH", e) 

-

409 

-

410 elif storage_type == StorageType.FILE: 

-

411 try: 

-

412 f = open(path, "rb") 

-

413 if range is None: 

-

414 data = f.read() 

-

415 else: 

-

416 f.seek(range[0]) 

-

417 data = f.read(range[1]) 

-

418 

-

419 f.close() 

-

420 

-

421 except FileNotFoundError: 

-

422 raise FileNotFoundError(f"{storage_type.value}{path}") 

-

423 

-

424 except Exception as e: 

-

425 raise StorageError("FILE", e) 

-

426 

-

427 elif storage_type == StorageType.HTTP or storage_type == StorageType.HTTPS: 

-

428 if range is None: 

-

429 try: 

-

430 reponse = requests.get(f"{storage_type.value}{path}", stream=True) 

-

431 if reponse.status_code == 404: 

-

432 raise FileNotFoundError(f"{storage_type.value}{path}") 

-

433 data = reponse.content 

-

434 except Exception as e: 

-

435 raise StorageError(storage_type.name, e) 

-

436 else: 

-

437 raise NotImplementedError("Cannot get partial data for storage type HTTP(S)") 

-

438 

-

439 else: 

-

440 raise NotImplementedError(f"Cannot get data for storage type {storage_type.name}") 

-

441 

-

442 return data 

-

443 

-

444 

-

445def get_data_binary(path: str, range: Tuple[int, int] = None) -> bytes: 

-

446 """Load data into a binary string 

-

447 

-

448 This function uses an LRU cache, with a TTL of 5 minutes by default 

-

449 

-

450 Args: 

-

451 path (str): path to data 

-

452 range (Tuple[int, int], optional): offset and size, to make a partial read. Defaults to None. 

-

453 

-

454 Raises: 

-

455 MissingEnvironmentError: Missing object storage information 

-

456 StorageError: Storage read issue 

-

457 FileNotFoundError: File or object does not exist 

-

458 NotImplementedError: Storage type not handled 

-

459 

-

460 Returns: 

-

461 bytes: Data binary content 

-

462 """ 

-

463 return __get_cached_data_binary(path, __get_ttl_hash(), range) 

-

464 

-

465 

-

def put_data_str(data: str, path: str) -> None:
    """Store string data into a file or an object

    UTF-8 encoding is used for bytes conversion

    Args:
        data (str): data to write
        path (str): destination path, where to write data

    Raises:
        MissingEnvironmentError: Missing object storage information
        StorageError: Storage write issue
        NotImplementedError: Storage type not handled
    """

    storage_type, path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.S3:
        s3_client, bucket_name = __get_s3_client(tray_name)

        try:
            s3_client["client"].put_object(
                Body=data.encode("utf-8"), Bucket=bucket_name, Key=base_name
            )
        except Exception as e:
            raise StorageError("S3", e)

    elif storage_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(tray_name)

        try:
            ioctx.write_full(base_name, data.encode("utf-8"))
        except Exception as e:
            raise StorageError("CEPH", e)

    elif storage_type == StorageType.FILE:
        try:
            f = open(path, "w")
            f.write(data)
            f.close()
        except Exception as e:
            raise StorageError("FILE", e)

    else:
        raise NotImplementedError(f"Cannot write data for storage type {storage_type.name}")


def get_size(path: str) -> int:
    """Get size of file or object

    Args:
        path (str): path of file/object whose size is asked

    Raises:
        MissingEnvironmentError: Missing object storage information
        StorageError: Storage read issue
        NotImplementedError: Storage type not handled

    Returns:
        int: file/object size, in bytes
    """

    storage_type, path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.S3:
        s3_client, bucket_name = __get_s3_client(tray_name)

        try:
            size = s3_client["client"].head_object(Bucket=bucket_name, Key=base_name)[
                "ContentLength"
            ]
            return int(size)
        except Exception as e:
            raise StorageError("S3", e)

    elif storage_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(tray_name)

        try:
            size, mtime = ioctx.stat(base_name)
            return size
        except Exception as e:
            raise StorageError("CEPH", e)

    elif storage_type == StorageType.FILE:
        try:
            file_stats = os.stat(path)
            return file_stats.st_size
        except Exception as e:
            raise StorageError("FILE", e)

    elif storage_type == StorageType.HTTP or storage_type == StorageType.HTTPS:
        try:
            # With stream=True, only the headers are downloaded initially
            size = requests.get(storage_type.value + path, stream=True).headers["content-length"]
            return int(size)
        except Exception as e:
            raise StorageError(storage_type.name, e)

    else:
        raise NotImplementedError(f"Cannot get size for storage type {storage_type.name}")


def exists(path: str) -> bool:
    """Does the file or object exist?

    Args:
        path (str): path of file/object to test

    Raises:
        MissingEnvironmentError: Missing object storage information
        StorageError: Storage read issue
        NotImplementedError: Storage type not handled

    Returns:
        bool: file/object existing status
    """

    storage_type, path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.S3:
        s3_client, bucket_name = __get_s3_client(tray_name)

        try:
            s3_client["client"].head_object(Bucket=bucket_name, Key=base_name)
            return True
        except botocore.exceptions.ClientError as e:
            if e.response["Error"]["Code"] == "404":
                return False
            else:
                raise StorageError("S3", e)

    elif storage_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(tray_name)

        try:
            ioctx.stat(base_name)
            return True
        except rados.ObjectNotFound:
            return False
        except Exception as e:
            raise StorageError("CEPH", e)

    elif storage_type == StorageType.FILE:
        return os.path.exists(path)

    elif storage_type == StorageType.HTTP or storage_type == StorageType.HTTPS:
        try:
            response = requests.get(storage_type.value + path, stream=True)
            return response.status_code == 200
        except Exception as e:
            raise StorageError(storage_type.name, e)

    else:
        raise NotImplementedError(f"Cannot test existence for storage type {storage_type.name}")


def remove(path: str) -> None:
    """Remove the file/object

    Args:
        path (str): path of file/object to remove

    Raises:
        MissingEnvironmentError: Missing object storage information
        StorageError: Storage removal issue
        NotImplementedError: Storage type not handled
    """
    storage_type, path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.S3:
        s3_client, bucket_name = __get_s3_client(tray_name)

        try:
            s3_client["client"].delete_object(Bucket=bucket_name, Key=base_name)
        except Exception as e:
            raise StorageError("S3", e)

    elif storage_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(tray_name)

        try:
            ioctx.remove_object(base_name)
        except rados.ObjectNotFound:
            pass
        except Exception as e:
            raise StorageError("CEPH", e)

    elif storage_type == StorageType.FILE:
        try:
            os.remove(path)
        except FileNotFoundError:
            pass
        except Exception as e:
            raise StorageError("FILE", e)

    else:
        raise NotImplementedError(f"Cannot remove data for storage type {storage_type.name}")


def copy(from_path: str, to_path: str, from_md5: str = None) -> None:
    """Copy a file or object to a file or object place. If a MD5 sum is provided, it is compared to the sum computed after the copy.

    Args:
        from_path (str): source file/object path, to copy
        to_path (str): destination file/object path
        from_md5 (str, optional): MD5 sum, re-processed after copy and controlled. Defaults to None.

    Raises:
        StorageError: Copy issue
        MissingEnvironmentError: Missing object storage information
        NotImplementedError: Storage type not handled
    """

    from_type, from_path, from_tray, from_base_name = get_infos_from_path(from_path)
    to_type, to_path, to_tray, to_base_name = get_infos_from_path(to_path)

    # Perform the copy, depending on the two storage types
    if from_type == StorageType.FILE and to_type == StorageType.FILE:
        try:
            if to_tray != "":
                os.makedirs(to_tray, exist_ok=True)

            copyfile(from_path, to_path)

            if from_md5 is not None:
                to_md5 = hash_file(to_path)
                if to_md5 != from_md5:
                    raise StorageError(
                        "FILE",
                        f"Invalid MD5 sum control for copy file {from_path} to {to_path} : {from_md5} != {to_md5}",
                    )

        except Exception as e:
            raise StorageError("FILE", f"Cannot copy file {from_path} to {to_path} : {e}")

    elif from_type == StorageType.S3 and to_type == StorageType.FILE:
        s3_client, from_bucket = __get_s3_client(from_tray)

        try:
            if to_tray != "":
                os.makedirs(to_tray, exist_ok=True)

            s3_client["client"].download_file(from_bucket, from_base_name, to_path)

            if from_md5 is not None:
                to_md5 = hash_file(to_path)
                if to_md5 != from_md5:
                    raise StorageError(
                        "S3 and FILE",
                        f"Invalid MD5 sum control for copy S3 object {from_path} to file {to_path} : {from_md5} != {to_md5}",
                    )

        except Exception as e:
            raise StorageError(
                "S3 and FILE", f"Cannot copy S3 object {from_path} to file {to_path} : {e}"
            )

    elif from_type == StorageType.FILE and to_type == StorageType.S3:
        s3_client, to_bucket = __get_s3_client(to_tray)

        try:
            s3_client["client"].upload_file(from_path, to_bucket, to_base_name)

            if from_md5 is not None:
                to_md5 = (
                    s3_client["client"]
                    .head_object(Bucket=to_bucket, Key=to_base_name)["ETag"]
                    .strip('"')
                )
                if to_md5 != from_md5:
                    raise StorageError(
                        "FILE and S3",
                        f"Invalid MD5 sum control for copy file {from_path} to S3 object {to_path} : {from_md5} != {to_md5}",
                    )
        except Exception as e:
            raise StorageError(
                "FILE and S3", f"Cannot copy file {from_path} to S3 object {to_path} : {e}"
            )

    elif from_type == StorageType.S3 and to_type == StorageType.S3:
        from_s3_client, from_bucket = __get_s3_client(from_tray)
        to_s3_client, to_bucket = __get_s3_client(to_tray)

        try:
            if to_s3_client["host"] == from_s3_client["host"]:
                to_s3_client["client"].copy(
                    {"Bucket": from_bucket, "Key": from_base_name}, to_bucket, to_base_name
                )
            else:
                with tempfile.NamedTemporaryFile("w+b") as f:
                    from_s3_client["client"].download_fileobj(from_bucket, from_base_name, f)
                    to_s3_client["client"].upload_file(f.name, to_bucket, to_base_name)

            if from_md5 is not None:
                to_md5 = (
                    to_s3_client["client"]
                    .head_object(Bucket=to_bucket, Key=to_base_name)["ETag"]
                    .strip('"')
                )
                if to_md5 != from_md5:
                    raise StorageError(
                        "S3",
                        f"Invalid MD5 sum control for copy S3 object {from_path} to {to_path} : {from_md5} != {to_md5}",
                    )

        except Exception as e:
            raise StorageError("S3", f"Cannot copy S3 object {from_path} to {to_path} : {e}")


    elif from_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE and to_type == StorageType.FILE:
        ioctx = __get_ceph_ioctx(from_tray)

        if from_md5 is not None:
            checker = hashlib.md5()

        try:
            if to_tray != "":
                os.makedirs(to_tray, exist_ok=True)
            f = open(to_path, "wb")

            offset = 0
            size = 0

            while True:
                chunk = ioctx.read(from_base_name, 65536, offset)
                size = len(chunk)
                offset += size
                f.write(chunk)

                if from_md5 is not None:
                    checker.update(chunk)

                if size < 65536:
                    break

            f.close()

            if from_md5 is not None and from_md5 != checker.hexdigest():
                raise StorageError(
                    "CEPH and FILE",
                    f"Invalid MD5 sum control for copy CEPH object {from_path} to file {to_path} : {from_md5} != {checker.hexdigest()}",
                )

        except Exception as e:
            raise StorageError(
                "CEPH and FILE", f"Cannot copy CEPH object {from_path} to file {to_path} : {e}"
            )

    elif from_type == StorageType.FILE and to_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(to_tray)

        if from_md5 is not None:
            checker = hashlib.md5()

        try:
            f = open(from_path, "rb")

            offset = 0
            size = 0

            while True:
                chunk = f.read(65536)
                size = len(chunk)
                ioctx.write(to_base_name, chunk, offset)
                offset += size

                if from_md5 is not None:
                    checker.update(chunk)

                if size < 65536:
                    break

            f.close()

            if from_md5 is not None and from_md5 != checker.hexdigest():
                raise StorageError(
                    "FILE and CEPH",
                    f"Invalid MD5 sum control for copy file {from_path} to CEPH object {to_path} : {from_md5} != {checker.hexdigest()}",
                )

        except Exception as e:
            raise StorageError(
                "FILE and CEPH", f"Cannot copy file {from_path} to CEPH object {to_path} : {e}"
            )

    elif from_type == StorageType.CEPH and to_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        from_ioctx = __get_ceph_ioctx(from_tray)
        to_ioctx = __get_ceph_ioctx(to_tray)

        if from_md5 is not None:
            checker = hashlib.md5()

        try:
            offset = 0
            size = 0

            while True:
                chunk = from_ioctx.read(from_base_name, 65536, offset)
                size = len(chunk)
                to_ioctx.write(to_base_name, chunk, offset)
                offset += size

                if from_md5 is not None:
                    checker.update(chunk)

                if size < 65536:
                    break

            if from_md5 is not None and from_md5 != checker.hexdigest():
                raise StorageError(
                    "CEPH",
                    f"Invalid MD5 sum control for copy CEPH object {from_path} to {to_path} : {from_md5} != {checker.hexdigest()}",
                )

        except Exception as e:
            raise StorageError("CEPH", f"Cannot copy CEPH object {from_path} to {to_path} : {e}")


    elif from_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE and to_type == StorageType.S3:
        from_ioctx = __get_ceph_ioctx(from_tray)

        s3_client, to_bucket = __get_s3_client(to_tray)

        if from_md5 is not None:
            checker = hashlib.md5()

        try:
            offset = 0
            size = 0

            with tempfile.NamedTemporaryFile("w+b", delete=False) as f:
                tmp_name = f.name
                while True:
                    chunk = from_ioctx.read(from_base_name, 65536, offset)
                    size = len(chunk)
                    offset += size
                    f.write(chunk)

                    if from_md5 is not None:
                        checker.update(chunk)

                    if size < 65536:
                        break

            s3_client["client"].upload_file(tmp_name, to_bucket, to_base_name)

            os.remove(tmp_name)

            if from_md5 is not None and from_md5 != checker.hexdigest():
                raise StorageError(
                    "CEPH and S3",
                    f"Invalid MD5 sum control for copy CEPH object {from_path} to S3 object {to_path} : {from_md5} != {checker.hexdigest()}",
                )

        except Exception as e:
            raise StorageError(
                "CEPH and S3", f"Cannot copy CEPH object {from_path} to S3 object {to_path} : {e}"
            )

    elif (
        from_type == StorageType.HTTP or from_type == StorageType.HTTPS
    ) and to_type == StorageType.FILE:
        try:
            response = requests.get(from_type.value + from_path, stream=True)
            with open(to_path, "wb") as f:
                for chunk in response.iter_content(chunk_size=65536):
                    if chunk:
                        f.write(chunk)

        except Exception as e:
            raise StorageError(
                "HTTP(S) and FILE",
                f"Cannot copy HTTP(S) object {from_path} to FILE object {to_path} : {e}",
            )

    elif (
        (from_type == StorageType.HTTP or from_type == StorageType.HTTPS)
        and to_type == StorageType.CEPH
        and CEPH_RADOS_AVAILABLE
    ):
        to_ioctx = __get_ceph_ioctx(to_tray)

        try:
            response = requests.get(from_type.value + from_path, stream=True)
            offset = 0
            for chunk in response.iter_content(chunk_size=65536):
                if chunk:
                    size = len(chunk)
                    to_ioctx.write(to_base_name, chunk, offset)
                    offset += size

        except Exception as e:
            raise StorageError(
                "HTTP(S) and CEPH",
                f"Cannot copy HTTP(S) object {from_path} to CEPH object {to_path} : {e}",
            )

    elif (
        from_type == StorageType.HTTP or from_type == StorageType.HTTPS
    ) and to_type == StorageType.S3:
        to_s3_client, to_bucket = __get_s3_client(to_tray)

        try:
            response = requests.get(from_type.value + from_path, stream=True)
            with tempfile.NamedTemporaryFile("w+b", delete=False) as f:
                tmp_name = f.name
                for chunk in response.iter_content(chunk_size=65536):
                    if chunk:
                        f.write(chunk)

            to_s3_client["client"].upload_file(tmp_name, to_bucket, to_base_name)

            os.remove(tmp_name)

        except Exception as e:
            raise StorageError(
                "HTTP(S) and S3",
                f"Cannot copy HTTP(S) object {from_path} to S3 object {to_path} : {e}",
            )

    else:
        raise NotImplementedError(
            f"Cannot copy data from storage type {from_type.name} to storage type {to_type.name}"
        )

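All the chunked branches of `copy` above share one pattern: stream 64 KiB chunks to the destination while feeding an MD5 checker, then compare digests at the end. A self-contained sketch of that pattern over plain file objects (`copy_stream` is an illustrative name, not part of the module):

```python
import hashlib


def copy_stream(src, dst, chunk_size: int = 65536) -> str:
    """Copy src to dst in chunks and return the MD5 hex digest of the copied data."""
    checker = hashlib.md5()
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            # Empty read: end of source reached
            break
        dst.write(chunk)
        checker.update(chunk)
    return checker.hexdigest()
```

A caller holding an expected digest (the `from_md5` argument in `copy`) then only has to compare it with the returned value to validate the transfer, without a second pass over the data.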

def link(target_path: str, link_path: str, hard: bool = False) -> None:
    """Create a link, symbolic by default

    Args:
        target_path (str): file/object to link
        link_path (str): link to create
        hard (bool, optional): hard link rather than symbolic. Only for FILE storage. Defaults to False.

    Raises:
        StorageError: link issue
        MissingEnvironmentError: Missing object storage information
        NotImplementedError: Storage type not handled
    """

    target_type, target_path, target_tray, target_base_name = get_infos_from_path(target_path)
    link_type, link_path, link_tray, link_base_name = get_infos_from_path(link_path)

    if target_type != link_type:
        raise StorageError(
            f"{target_type.name} and {link_type.name}",
            "Cannot make link between two different storage types",
        )

    if hard and target_type != StorageType.FILE:
        raise StorageError(target_type.name, "Hard link is available only for FILE storage")

    # Create the link, depending on the storage type
    if target_type == StorageType.S3:
        target_s3_client, target_bucket = __get_s3_client(target_tray)
        link_s3_client, link_bucket = __get_s3_client(link_tray)

        if target_s3_client["host"] != link_s3_client["host"]:
            raise StorageError(
                "S3",
                f"Cannot make link {link_path} -> {target_path} : link works only on the same S3 cluster",
            )

        try:
            target_s3_client["client"].put_object(
                Body=f"{__OBJECT_SYMLINK_SIGNATURE}{target_bucket}/{target_base_name}".encode(),
                Bucket=link_bucket,
                Key=link_base_name,
            )
        except Exception as e:
            raise StorageError("S3", e)

    elif target_type == StorageType.CEPH and CEPH_RADOS_AVAILABLE:
        ioctx = __get_ceph_ioctx(link_tray)

        try:
            ioctx.write_full(link_base_name, f"{__OBJECT_SYMLINK_SIGNATURE}{target_path}".encode())
        except Exception as e:
            raise StorageError("CEPH", e)

    elif target_type == StorageType.FILE:
        try:
            to_tray = get_infos_from_path(link_path)[2]
            if to_tray != "":
                os.makedirs(to_tray, exist_ok=True)

            if exists(link_path):
                remove(link_path)
            if hard:
                os.link(target_path, link_path)
            else:
                os.symlink(target_path, link_path)
        except Exception as e:
            raise StorageError("FILE", e)

    else:
        raise NotImplementedError(f"Cannot make link for storage type {target_type.name}")

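On object storages, `link` emulates a symbolic link by writing a small object whose body is a signature prefix followed by the target path; a reader can then detect such pseudo-links by checking that prefix. A minimal sketch of the read side (the `SIGNATURE` value below is hypothetical and stands in for the module's `__OBJECT_SYMLINK_SIGNATURE` constant, whose real value is defined elsewhere):

```python
SIGNATURE = "SYMLINK#"  # hypothetical, stands in for __OBJECT_SYMLINK_SIGNATURE


def resolve_object_link(content: bytes):
    """Return the target path if content is a pseudo-link object, else None."""
    text = content.decode("utf-8", errors="ignore")
    if text.startswith(SIGNATURE):
        # Everything after the signature is the target (e.g. "bucket/object")
        return text[len(SIGNATURE):]
    return None
```

A reader would typically fetch the first bytes of an object, call this resolver, and re-issue the read against the returned target when it is not `None`.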

def get_osgeo_path(path: str) -> str:
    """Return GDAL/OGR Open compliant path and configure storage access

    For a S3 input path, endpoint, access and secret keys are set and path is built with "/vsis3" root.

    For a FILE input path, only the storage prefix is removed.

    Args:
        path (str): Source path

    Raises:
        NotImplementedError: Storage type not handled

    Returns:
        str: GDAL/OGR Open compliant path
    """

    storage_type, unprefixed_path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.S3 and GDAL_AVAILABLE:
        s3_client, bucket_name = __get_s3_client(tray_name)

        gdal.SetConfigOption("AWS_SECRET_ACCESS_KEY", s3_client["secret_key"])
        gdal.SetConfigOption("AWS_ACCESS_KEY_ID", s3_client["key"])
        gdal.SetConfigOption("AWS_S3_ENDPOINT", s3_client["host"])
        gdal.SetConfigOption("AWS_VIRTUAL_HOSTING", "FALSE")
        if not s3_client["secure"]:
            gdal.SetConfigOption("AWS_HTTPS", "NO")

        return f"/vsis3/{bucket_name}/{base_name}"

    elif storage_type == StorageType.FILE:
        return unprefixed_path

    else:
        raise NotImplementedError(f"Cannot get a GDAL/OGR compliant path from {path}")


def size_path(path: str) -> int:
    """Return the size of the given path (or, for CEPH, the sum of the size of each object of the .list)

    Args:
        path (str): Source path

    Raises:
        StorageError: Unhandled link or link issue
        MissingEnvironmentError: Missing object storage information
        NotImplementedError: Storage type not handled

    Returns:
        int: size of the path
    """
    storage_type, unprefixed_path, tray_name, base_name = get_infos_from_path(path)

    if storage_type == StorageType.FILE:
        try:
            total = 0
            with os.scandir(unprefixed_path) as it:
                for entry in it:
                    if entry.is_file():
                        total += entry.stat().st_size
                    elif entry.is_dir():
                        total += size_path(entry.path)

        except Exception as e:
            raise StorageError("FILE", e)

    elif storage_type == StorageType.S3:
        s3_client, bucket_name = __get_s3_client(tray_name)

        try:
            paginator = s3_client["client"].get_paginator("list_objects_v2")
            pages = paginator.paginate(
                Bucket=bucket_name,
                Prefix=base_name + "/",
                PaginationConfig={
                    "PageSize": 10000,
                },
            )
            total = 0
            for page in pages:
                for key in page["Contents"]:
                    total += key["Size"]

        except Exception as e:
            raise StorageError("S3", e)

    else:
        raise NotImplementedError(
            f"Cannot get prefix path size for storage type {storage_type.name}"
        )

    return total

diff --git a/2.2.2/tests/z_4cdda0aa429327c0_style_py.html b/2.2.2/tests/z_4cdda0aa429327c0_style_py.html
deleted file mode 100644
index b756311..0000000
--- a/2.2.2/tests/z_4cdda0aa429327c0_style_py.html
+++ /dev/null
@@ -1,639 +0,0 @@
Coverage for src/rok4/style.py: 90% (177 statements), coverage.py v7.6.1, created at 2024-10-01 15:08 +0000

"""Provide classes to use a ROK4 style.

The module contains the following class:

- `Style` - Style descriptor, to convert raster data

Loading a style requires environment variables :

- ROK4_STYLES_DIRECTORY
"""

# -- IMPORTS --

# standard library
import json
import os
from json.decoder import JSONDecodeError
from typing import Dict, Tuple

from rok4.enums import ColorFormat

# package
from rok4.exceptions import FormatError, MissingAttributeError, MissingEnvironmentError
from rok4.storage import exists, get_data_str

DEG_TO_RAD = 0.0174532925199432958


class Colour:
    """A palette's RGBA colour.

    Attributes:
        value (float): Value to convert to RGBA
        red (int): Red value (from 0 to 255)
        green (int): Green value (from 0 to 255)
        blue (int): Blue value (from 0 to 255)
        alpha (int): Alpha value (from 0 to 255)
    """

    def __init__(self, palette: Dict, style: "Style") -> None:
        """Constructor method

        Args:
            palette: Colour attributes, according to JSON structure
            style: Style object containing the palette's colour to create

        Examples:

            JSON colour section

            {
                "value": 600,
                "red": 220,
                "green": 179,
                "blue": 99,
                "alpha": 255
            }

        Raises:
            MissingAttributeError: Attribute is missing in the content
            Exception: Invalid colour's band
        """

        try:
            self.value = palette["value"]

            self.red = palette["red"]
            if self.red < 0 or self.red > 255:
                raise Exception(
                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
                )
            self.green = palette["green"]
            if self.green < 0 or self.green > 255:
                raise Exception(
                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
                )
            self.blue = palette["blue"]
            if self.blue < 0 or self.blue > 255:
                raise Exception(
                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
                )
            self.alpha = palette["alpha"]
            if self.alpha < 0 or self.alpha > 255:
                raise Exception(
                    f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
                )

        except KeyError as e:
            raise MissingAttributeError(style.path, f"palette.colours[].{e}")

        except TypeError:
            raise Exception(
                f"In style '{style.path}', a palette colour band has an invalid value (integer between 0 and 255 expected)"
            )

    @property
    def rgba(self) -> Tuple[int]:
        return (self.red, self.green, self.blue, self.alpha)

    @property
    def rgb(self) -> Tuple[int]:
        return (self.red, self.green, self.blue)


class Palette:
    """A style's RGBA palette.

    Attributes:
        no_alpha (bool): Colour without alpha band
        rgb_continuous (bool): Continuous RGB values ?
        alpha_continuous (bool): Continuous alpha values ?
        colours (List[Colour]): Palette's colours, input values ascending
    """

    def __init__(self, palette: Dict, style: "Style") -> None:
        """Constructor method

        Args:
            palette: Palette attributes, according to JSON structure
            style: Style object containing the palette to create

        Examples:

            JSON palette section

            {
                "no_alpha": false,
                "rgb_continuous": true,
                "alpha_continuous": true,
                "colours": [
                    { "value": -99999, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
                    { "value": -99998.1, "red": 255, "green": 255, "blue": 255, "alpha": 0 },
                    { "value": -99998.0, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
                    { "value": -501, "red": 255, "green": 0, "blue": 255, "alpha": 255 },
                    { "value": -500, "red": 1, "green": 29, "blue": 148, "alpha": 255 },
                    { "value": -15, "red": 19, "green": 42, "blue": 255, "alpha": 255 },
                    { "value": 0, "red": 67, "green": 105, "blue": 227, "alpha": 255 },
                    { "value": 0.01, "red": 57, "green": 151, "blue": 105, "alpha": 255 },
                    { "value": 300, "red": 230, "green": 230, "blue": 128, "alpha": 255 },
                    { "value": 600, "red": 220, "green": 179, "blue": 99, "alpha": 255 },
                    { "value": 2000, "red": 162, "green": 100, "blue": 51, "alpha": 255 },
                    { "value": 2500, "red": 122, "green": 81, "blue": 40, "alpha": 255 },
                    { "value": 3000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
                    { "value": 9000, "red": 255, "green": 255, "blue": 255, "alpha": 255 },
                    { "value": 9001, "red": 255, "green": 255, "blue": 255, "alpha": 255 }
                ]
            }

        Raises:
            MissingAttributeError: Attribute is missing in the content
            Exception: No colour in the palette or invalid colour
        """

        try:
            self.no_alpha = palette["no_alpha"]
            self.rgb_continuous = palette["rgb_continuous"]
            self.alpha_continuous = palette["alpha_continuous"]

            self.colours = []
            for colour in palette["colours"]:
                self.colours.append(Colour(colour, style))
                if len(self.colours) >= 2 and self.colours[-1].value <= self.colours[-2].value:
                    raise Exception(
                        f"Style '{style.path}' palette colours have to be ordered input value ascending"
                    )

            if len(self.colours) == 0:
                raise Exception(f"Style '{style.path}' palette has no colour")

        except KeyError as e:
            raise MissingAttributeError(style.path, f"palette.{e}")


173 def convert(self, value: float) -> Tuple[int]: 

-

174 

-

175 # Les couleurs dans la palette sont rangées par valeur croissante 

-

176 # On commence par gérer les cas où la valeur est en dehors de la palette 

-

177 

-

178 if value <= self.colours[0].value: 

-

179 if self.no_alpha: 

-

180 return self.colours[0].rgb 

-

181 else: 

-

182 return self.colours[0].rgba 

-

183 

-

184 if value >= self.colours[-1].value: 

-

185 if self.no_alpha: 

-

186 return self.colours[-1].rgb 

-

187 else: 

-

188 return self.colours[-1].rgba 

-

189 

-

190 # On va maintenant chercher les deux couleurs entre lesquelles la valeur est 

-

191 for i in range(1, len(self.colours)): 

-

192 if self.colours[i].value < value: 

-

193 continue 

-

194 

-

195 # on est sur la première couleur de valeur supérieure 

-

196 colour_inf = self.colours[i - 1] 

-

197 colour_sup = self.colours[i] 

-

198 break 

-

199 

-

200 ratio = (value - colour_inf.value) / (colour_sup.value - colour_inf.value) 

-

201 if self.rgb_continuous: 

-

202 pixel = ( 

-

203 colour_inf.red + ratio * (colour_sup.red - colour_inf.red), 

-

204 colour_inf.green + ratio * (colour_sup.green - colour_inf.green), 

-

205 colour_inf.blue + ratio * (colour_sup.blue - colour_inf.blue), 

-

206 ) 

-

207 else: 

-

208 pixel = (colour_inf.red, colour_inf.green, colour_inf.blue) 

-

209 

-

210 if self.no_alpha: 

-

211 return pixel 

-

212 else: 

-

213 if self.alpha_continuous: 

-

214 return pixel + (colour_inf.alpha + ratio * (colour_sup.alpha - colour_inf.alpha),) 

-

215 else: 

-

216 return pixel + (colour_inf.alpha,) 

-

217 

-

218 
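The interpolation in `convert` can be sketched standalone: colour stops are kept sorted by value, and a ratio in [0, 1] mixes each channel linearly between the two surrounding stops. A minimal sketch; the black-to-white stops below are made up for illustration:

```python
def lerp_colour(stop_inf, stop_sup, value):
    """Linearly interpolate an RGB triple between two (value, (r, g, b)) stops."""
    v_inf, rgb_inf = stop_inf
    v_sup, rgb_sup = stop_sup
    ratio = (value - v_inf) / (v_sup - v_inf)
    # Same per-channel formula as Palette.convert: inf + ratio * (sup - inf)
    return tuple(c_inf + ratio * (c_sup - c_inf) for c_inf, c_sup in zip(rgb_inf, rgb_sup))

# Hypothetical stops: value 0 -> black, value 100 -> white
print(lerp_colour((0, (0, 0, 0)), (100, (255, 255, 255)), 50))  # (127.5, 127.5, 127.5)
```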

-

219class Slope: 

-

220 """A style's slope parameters. 

-

221 

-

222 Attributes: 

-

223 algo (str): Slope calculation algorithm chosen by the user ("H" for Horn) 

-

224 unit (str): Slope unit 

-

225 image_nodata (float): Nodata input value 

-

226 slope_nodata (float): Nodata slope value 

-

227 slope_max (float): Maximum value for the slope 

-

228 """ 

-

229 

-

230 def __init__(self, slope: Dict, style: "Style") -> None: 

-

231 """Constructor method 

-

232 

-

233 Args: 

-

234 slope: Slope attributes, according to JSON structure 

-

235 style: Style object containing the slope to create 

-

236 

-

237 Examples: 

-

238 

-

239 JSON pente section 

-

240 

-

241 { 

-

242 "algo": "H", 

-

243 "unit": "degree", 

-

244 "image_nodata": -99999, 

-

245 "slope_nodata": 91, 

-

246 "slope_max": 90 

-

247 } 

-

248 

-

249 Raises: 

-

250 MissingAttributeError: Attribute is missing in the content 

-

251 """ 

-

252 

-

253 try: 

-

254 self.algo = slope.get("algo", "H") 

-

255 self.unit = slope.get("unit", "degree") 

-

256 self.image_nodata = slope.get("image_nodata", -99999) 

-

257 self.slope_nodata = slope.get("slope_nodata", 0) 

-

258 self.slope_max = slope.get("slope_max", 90) 

-

259 except KeyError as e: 

-

260 raise MissingAttributeError(style.path, f"pente.{e}") 

-

261 

-

262 

-

263class Exposition: 

-

264 """A style's exposition parameters. 

-

265 

-

266 Attributes: 

-

267 algo (str): Slope calculation algorithm chosen by the user ("H" for Horn) 

-

268 min_slope (int): Slope from which exposition is computed 

-

269 image_nodata (float): Nodata input value 

-

270 exposition_nodata (float): Nodata exposition value 

-

271 """ 

-

272 

-

273 def __init__(self, exposition: Dict, style: "Style") -> None: 

-

274 """Constructor method 

-

275 

-

276 Args: 

-

277 exposition: Exposition attributes, according to JSON structure 

-

278 style: Style object containing the exposition to create 

-

279 

-

280 Examples: 

-

281 

-

282 JSON exposition section 

-

283 

-

284 { 

-

285 "algo": "H", 

-

286 "min_slope": 1 

-

287 } 

-

288 

-

289 Raises: 

-

290 MissingAttributeError: Attribute is missing in the content 

-

291 """ 

-

292 

-

293 try: 

-

294 self.algo = exposition.get("algo", "H") 

-

295 self.min_slope = exposition.get("min_slope", 1.0) * DEG_TO_RAD 

-

296 self.image_nodata = exposition.get("image_nodata", -99999)

-

297 self.exposition_nodata = exposition.get("aspect_nodata", -1) 

-

298 except KeyError as e: 

-

299 raise MissingAttributeError(style.path, f"exposition.{e}") 

-

300 

-

301 

-

302class Estompage: 

-

303 """A style's estompage parameters. 

-

304 

-

305 Attributes: 

-

306 zenith (float): Sun's zenith in degree 

-

307 azimuth (float): Sun's azimuth in degree 

-

308 z_factor (int): Slope exaggeration factor 

-

309 image_nodata (float): Nodata input value 

-

310 estompage_nodata (float): Nodata estompage value 

-

311 """ 

-

312 

-

313 def __init__(self, estompage: Dict, style: "Style") -> None: 

-

314 """Constructor method 

-

315 

-

316 Args: 

-

317 estompage: Estompage attributes, according to JSON structure 

-

318 style: Style object containing the estompage to create 

-

319 

-

320 Examples: 

-

321 

-

322 JSON estompage section 

-

323 

-

324 { 

-

325 "zenith": 45, 

-

326 "azimuth": 315, 

-

327 "z_factor": 1 

-

328 } 

-

329 

-

330 Raises: 

-

331 MissingAttributeError: Attribute is missing in the content 

-

332 """ 

-

333 

-

334 try: 

-

335 # zenith and azimuth are converted to their complements, in radians

-

336 self.zenith = (90.0 - estompage.get("zenith", 45)) * DEG_TO_RAD 

-

337 self.azimuth = (360.0 - estompage.get("azimuth", 315)) * DEG_TO_RAD 

-

338 self.z_factor = estompage.get("z_factor", 1) 

-

339 self.image_nodata = estompage.get("image_nodata", -99999.0) 

-

340 self.estompage_nodata = estompage.get("estompage_nodata", 0.0) 

-

341 except KeyError as e: 

-

342 raise MissingAttributeError(style.path, f"estompage.{e}") 

-

343 

-

344 

-

345class Legend: 

-

346 """A style's legend. 

-

347 

-

348 Attributes: 

-

349 format (str): Legend image's mime type 

-

350 url (str): Legend image's url 

-

351 height (int): Legend image's pixel height 

-

352 width (int): Legend image's pixel width 

-

353 min_scale_denominator (int): Minimum scale at which the legend is applicable 

-

354 max_scale_denominator (int): Maximum scale at which the legend is applicable 

-

355 """ 

-

356 

-

357 def __init__(self, legend: Dict, style: "Style") -> None: 

-

358 """Constructor method 

-

359 

-

360 Args: 

-

361 legend: Legend attributes, according to JSON structure 

-

362 style: Style object containing the legend to create 

-

363 

-

364 Examples: 

-

365 

-

366 JSON legend section 

-

367 

-

368 { 

-

369 "format": "image/png", 

-

370 "url": "http://ign.fr", 

-

371 "height": 100, 

-

372 "width": 100, 

-

373 "min_scale_denominator": 0, 

-

374 "max_scale_denominator": 30 

-

375 } 

-

376 

-

377 Raises: 

-

378 MissingAttributeError: Attribute is missing in the content 

-

379 """ 

-

380 

-

381 try: 

-

382 self.format = legend["format"] 

-

383 self.url = legend["url"] 

-

384 self.height = legend["height"] 

-

385 self.width = legend["width"] 

-

386 self.min_scale_denominator = legend["min_scale_denominator"] 

-

387 self.max_scale_denominator = legend["max_scale_denominator"] 

-

388 except KeyError as e: 

-

389 raise MissingAttributeError(style.path, f"legend.{e}") 

-

390 

-

391 

-

392class Style: 

-

393 """A raster data style 

-

394 

-

395 Attributes: 

-

396 path (str): TMS origin path (JSON) 

-

397 id (str): Style's technical identifier 

-

398 identifier (str): Style's public identifier 

-

399 title (str): Style's title 

-

400 abstract (str): Style's abstract 

-

401 keywords (List[str]): Style's keywords 

-

402 legend (Legend): Style's legend 

-

403 

-

404 palette (Palette): Style's palette, optional

-

405 estompage (Estompage): Style's estompage parameters, optional

-

406 slope (Slope): Style's slope parameters, optional

-

407 exposition (Exposition): Style's exposition parameters, optional

-

408 

-

409 """ 

-

410 

-

411 def __init__(self, id: str) -> None: 

-

412 """Constructor method 

-

413 

-

414 Style's directory is defined with the environment variable ROK4_STYLES_DIRECTORY. The provided id is used as file/object name, with or without the JSON extension

-

415 

-

416 Args: 

-

417 id: Style's id

-

418 

-

419 Raises: 

-

420 MissingEnvironmentError: Missing object storage information

-

421 StorageError: Storage read issue 

-

422 FileNotFoundError: Style file or object does not exist, with or without extension 

-

423 FormatError: Provided path is not a well formed JSON 

-

424 MissingAttributeError: Attribute is missing in the content 

-

425 Exception: No colour in the palette or invalid colour 

-

426 """ 

-

427 

-

428 self.id = id 

-

429 

-

430 try: 

-

431 self.path = os.path.join(os.environ["ROK4_STYLES_DIRECTORY"], f"{self.id}") 

-

432 if not exists(self.path): 

-

433 self.path = os.path.join(os.environ["ROK4_STYLES_DIRECTORY"], f"{self.id}.json") 

-

434 if not exists(self.path): 

-

435 raise FileNotFoundError(f"{self.path}, even without extension") 

-

436 except KeyError as e: 

-

437 raise MissingEnvironmentError(e) 

-

438 

-

439 try: 

-

440 data = json.loads(get_data_str(self.path)) 

-

441 

-

442 self.identifier = data["identifier"] 

-

443 self.title = data["title"] 

-

444 self.abstract = data["abstract"] 

-

445 self.keywords = data["keywords"] 

-

446 

-

447 self.legend = Legend(data["legend"], self) 

-

448 

-

449 if "palette" in data: 

-

450 self.palette = Palette(data["palette"], self) 

-

451 else: 

-

452 self.palette = None 

-

453 

-

454 if "estompage" in data: 

-

455 self.estompage = Estompage(data["estompage"], self) 

-

456 else: 

-

457 self.estompage = None 

-

458 

-

459 if "pente" in data: 

-

460 self.slope = Slope(data["pente"], self) 

-

461 else: 

-

462 self.slope = None 

-

463 

-

464 if "exposition" in data: 

-

465 self.exposition = Exposition(data["exposition"], self) 

-

466 else: 

-

467 self.exposition = None 

-

468 

-

469 except JSONDecodeError as e: 

-

470 raise FormatError("JSON", self.path, e) 

-

471 

-

472 except KeyError as e: 

-

473 raise MissingAttributeError(self.path, e) 

-

474 

-

475 @property 

-

476 def bands(self) -> int: 

-

477 """Bands count after style application 

-

478 

-

479 Returns: 

-

480 int: Bands count after style application, None if style is identity 

-

481 """ 

-

482 if self.palette is not None: 

-

483 if self.palette.no_alpha: 

-

484 return 3 

-

485 else: 

-

486 return 4 

-

487 

-

488 elif self.estompage is not None or self.exposition is not None or self.slope is not None: 

-

489 return 1 

-

490 

-

491 else: 

-

492 return None 

-

493 

-

494 @property 

-

495 def format(self) -> ColorFormat: 

-

496 """Bands format after style application 

-

497 

-

498 Returns: 

-

499 ColorFormat: Bands format after style application, None if style is identity 

-

500 """ 

-

501 if self.palette is not None: 

-

502 return ColorFormat.UINT8 

-

503 

-

504 elif self.estompage is not None or self.exposition is not None or self.slope is not None: 

-

505 return ColorFormat.FLOAT32 

-

506 

-

507 else: 

-

508 return None 

-

509 

-

510 @property 

-

511 def input_nodata(self) -> float: 

-

512 """Input nodata value 

-

513 

-

514 Returns: 

-

515 float: Input nodata value, None if style is identity 

-

516 """ 

-

517 

-

518 if self.estompage is not None: 

-

519 return self.estompage.image_nodata 

-

520 elif self.exposition is not None: 

-

521 return self.exposition.image_nodata 

-

522 elif self.slope is not None: 

-

523 return self.slope.image_nodata 

-

524 elif self.palette is not None: 

-

525 return self.palette.colours[0].value 

-

526 else: 

-

527 return None 

-

528 

-

529 @property 

-

530 def is_identity(self) -> bool: 

-

531 """Is style identity 

-

532 

-

533 Returns: 

-

534 bool: Is style identity 

-

535 """ 

-

536 

-

537 return ( 

-

538 self.estompage is None 

-

539 and self.exposition is None 

-

540 and self.slope is None 

-

541 and self.palette is None 

-

542 ) 

Coverage for src/rok4/tile_matrix_set.py: 96% (77 statements), coverage.py v7.6.1, created at 2024-10-01 15:08 +0000
1"""Provide classes to use a tile matrix set. 

-

2 

-

3The module contains the following classes: 

-

4 

-

5- `TileMatrixSet` - Multi level grid 

-

6- `TileMatrix` - A tile matrix set level 

-

7 

-

8Loading a tile matrix set requires environment variables : 

-

9 

-

10- ROK4_TMS_DIRECTORY 

-

11""" 

-

12 

-

13# -- IMPORTS -- 

-

14 

-

15# standard library 

-

16import json 

-

17import os 

-

18from json.decoder import JSONDecodeError 

-

19from typing import Dict, List, Tuple 

-

20 

-

21# package 

-

22from rok4.exceptions import FormatError, MissingAttributeError, MissingEnvironmentError 

-

23from rok4.storage import get_data_str 

-

24from rok4.utils import srs_to_spatialreference 

-

25 

-

26# -- GLOBALS -- 

-

27 

-

28 

-

29class TileMatrix: 

-

30 """A tile matrix is a tile matrix set's level. 

-

31 

-

32 Attributes: 

-

33 id (str): TM identifier (no underscore).

-

34 tms (TileMatrixSet): TMS to which it belongs

-

35 resolution (float): Ground size of a pixel, using the unit of the TMS's coordinate system.

-

36 origin (Tuple[float, float]): X,Y coordinates of the upper left corner for the level, the grid's origin. 

-

37 tile_size (Tuple[int, int]): Pixel width and height of a tile. 

-

38 matrix_size (Tuple[int, int]): Number of tiles in the level, widthwise and heightwise.

-

39 """ 

-

40 

-

41 def __init__(self, level: Dict, tms: "TileMatrixSet") -> None: 

-

42 """Constructor method 

-

43 

-

44 Args: 

-

45 level: Level attributes, according to JSON structure 

-

46 tms: TMS object containing the level to create 

-

47 

-

48 Raises: 

-

49 MissingAttributeError: Attribute is missing in the content 

-

50 """ 

-

51 

-

52 self.tms = tms 

-

53 try: 

-

54 self.id = level["id"] 

-

55 if self.id.find("_") != -1: 

-

56 raise Exception( 

-

57 f"TMS {tms.path} owns a level whom id contains an underscore ({self.id})" 

-

58 ) 

-

59 self.resolution = level["cellSize"] 

-

60 self.origin = ( 

-

61 level["pointOfOrigin"][0], 

-

62 level["pointOfOrigin"][1], 

-

63 ) 

-

64 self.tile_size = ( 

-

65 level["tileWidth"], 

-

66 level["tileHeight"], 

-

67 ) 

-

68 self.matrix_size = ( 

-

69 level["matrixWidth"], 

-

70 level["matrixHeight"], 

-

71 ) 

-

72 self.__latlon = ( 

-

73 self.tms.sr.EPSGTreatsAsLatLong() or self.tms.sr.EPSGTreatsAsNorthingEasting() 

-

74 ) 

-

75 except KeyError as e: 

-

76 raise MissingAttributeError(tms.path, f"tileMatrices[].{e}") 

-

77 

-

78 def x_to_column(self, x: float) -> int: 

-

79 """Convert west-east coordinate to tile's column 

-

80 

-

81 Args: 

-

82 x (float): west-east coordinate (TMS coordinates system) 

-

83 

-

84 Returns: 

-

85 int: tile's column 

-

86 """ 

-

87 return int((x - self.origin[0]) / (self.resolution * self.tile_size[0])) 

-

88 

-

89 def y_to_row(self, y: float) -> int: 

-

90 """Convert north-south coordinate to tile's row 

-

91 

-

92 Args: 

-

93 y (float): north-south coordinate (TMS coordinates system) 

-

94 

-

95 Returns: 

-

96 int: tile's row 

-

97 """ 

-

98 return int((self.origin[1] - y) / (self.resolution * self.tile_size[1])) 

-

99 

-

100 def tile_to_bbox(self, tile_col: int, tile_row: int) -> Tuple[float, float, float, float]: 

-

101 """Get tile terrain extent (xmin, ymin, xmax, ymax), in TMS coordinates system 

-

102 

-

103 The case where the TMS spatial reference is Lat / Lon is handled.

-

104 

-

105 Args: 

-

106 tile_col (int): column index

-

107 tile_row (int): row index

-

108 

-

109 Returns: 

-

110 Tuple[float, float, float, float]: terrain extent (xmin, ymin, xmax, ymax) 

-

111 """ 

-

112 if self.__latlon: 

-

113 return ( 

-

114 self.origin[1] - self.resolution * (tile_row + 1) * self.tile_size[1], 

-

115 self.origin[0] + self.resolution * tile_col * self.tile_size[0], 

-

116 self.origin[1] - self.resolution * tile_row * self.tile_size[1], 

-

117 self.origin[0] + self.resolution * (tile_col + 1) * self.tile_size[0], 

-

118 ) 

-

119 else: 

-

120 return ( 

-

121 self.origin[0] + self.resolution * tile_col * self.tile_size[0], 

-

122 self.origin[1] - self.resolution * (tile_row + 1) * self.tile_size[1], 

-

123 self.origin[0] + self.resolution * (tile_col + 1) * self.tile_size[0], 

-

124 self.origin[1] - self.resolution * tile_row * self.tile_size[1], 

-

125 ) 

-

126 

-

127 def bbox_to_tiles(self, bbox: Tuple[float, float, float, float]) -> Tuple[int, int, int, int]: 

-

128 """Get extrems tile columns and rows corresponding to provided bounding box 

-

129 

-

130 The case where the TMS spatial reference is Lat / Lon is handled.

-

131 

-

132 Args: 

-

133 bbox (Tuple[float, float, float, float]): bounding box (xmin, ymin, xmax, ymax), in TMS coordinates system 

-

134 

-

135 Returns: 

-

136 Tuple[int, int, int, int]: extreme tiles (col_min, row_min, col_max, row_max)

-

137 """ 

-

138 

-

139 if self.__latlon: 

-

140 return ( 

-

141 self.x_to_column(bbox[1]), 

-

142 self.y_to_row(bbox[2]), 

-

143 self.x_to_column(bbox[3]), 

-

144 self.y_to_row(bbox[0]), 

-

145 ) 

-

146 else: 

-

147 return ( 

-

148 self.x_to_column(bbox[0]), 

-

149 self.y_to_row(bbox[3]), 

-

150 self.x_to_column(bbox[2]), 

-

151 self.y_to_row(bbox[1]), 

-

152 ) 

-

153 

-

154 def point_to_indices(self, x: float, y: float) -> Tuple[int, int, int, int]: 

-

155 """Get pyramid's tile and pixel indices from point's coordinates 

-

156 

-

157 TMS spatial reference with Lat / Lon order is handled. 

-

158 

-

159 Args: 

-

160 x (float): point's x 

-

161 y (float): point's y 

-

162 

-

163 Returns: 

-

164 Tuple[int, int, int, int]: tile's column, tile's row, pixel's (in the tile) column, pixel's row 

-

165 """ 

-

166 

-

167 if self.__latlon: 

-

168 absolute_pixel_column = int((y - self.origin[0]) / self.resolution) 

-

169 absolute_pixel_row = int((self.origin[1] - x) / self.resolution) 

-

170 else: 

-

171 absolute_pixel_column = int((x - self.origin[0]) / self.resolution) 

-

172 absolute_pixel_row = int((self.origin[1] - y) / self.resolution) 

-

173 

-

174 return ( 

-

175 absolute_pixel_column // self.tile_size[0], 

-

176 absolute_pixel_row // self.tile_size[1], 

-

177 absolute_pixel_column % self.tile_size[0], 

-

178 absolute_pixel_row % self.tile_size[1], 

-

179 ) 

-

180 

-

181 @property 

-

182 def tile_width(self) -> int: 

-

183 return self.tile_size[0] 

-

184 

-

185 @property 

-

186 def tile_heigth(self) -> int: 

-

187 return self.tile_size[1] 

-

188 

-

189 
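The indexing arithmetic shared by `x_to_column`, `y_to_row` and `point_to_indices` boils down to integer division of an absolute pixel index. A minimal standalone sketch, X/Y axes order only; the level parameters (origin, resolution, tile size) are made up:

```python
def point_to_indices(x, y, origin, resolution, tile_size):
    """Tile indices and in-tile pixel indices for a point (X/Y axes order)."""
    absolute_pixel_column = int((x - origin[0]) / resolution)
    absolute_pixel_row = int((origin[1] - y) / resolution)
    return (
        absolute_pixel_column // tile_size[0],  # tile's column
        absolute_pixel_row // tile_size[1],     # tile's row
        absolute_pixel_column % tile_size[0],   # pixel's column in the tile
        absolute_pixel_row % tile_size[1],      # pixel's row in the tile
    )

# Hypothetical level: origin (0, 1000), 1 unit per pixel, 256x256 tiles
print(point_to_indices(300.0, 700.0, (0.0, 1000.0), 1.0, (256, 256)))  # (1, 1, 44, 44)
```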

-

190class TileMatrixSet: 

-

191 """A tile matrix set is multi levels grid definition 

-

192 

-

193 Attributes: 

-

194 name (str): TMS's name 

-

195 path (str): TMS origin path (JSON) 

-

196 id (str): TMS identifier 

-

197 srs (str): TMS coordinates system 

-

198 sr (osgeo.osr.SpatialReference): TMS OSR spatial reference 

-

199 levels (Dict[str, TileMatrix]): TMS levels 

-

200 """ 

-

201 

-

202 def __init__(self, name: str) -> None: 

-

203 """Constructor method 

-

204 

-

205 Args: 

-

206 name: TMS's name 

-

207 

-

208 Raises: 

-

209 MissingEnvironmentError: Missing object storage information

-

210 Exception: No level in the TMS, CRS not recognized by OSR 

-

211 StorageError: Storage read issue 

-

212 FileNotFoundError: TMS file or object does not exist 

-

213 FormatError: Provided path is not a well formed JSON 

-

214 MissingAttributeError: Attribute is missing in the content 

-

215 """ 

-

216 

-

217 self.name = name 

-

218 

-

219 try: 

-

220 self.path = os.path.join(os.environ["ROK4_TMS_DIRECTORY"], f"{self.name}.json") 

-

221 except KeyError as e: 

-

222 raise MissingEnvironmentError(e) 

-

223 

-

224 try: 

-

225 data = json.loads(get_data_str(self.path)) 

-

226 

-

227 self.id = data["id"] 

-

228 self.srs = data["crs"] 

-

229 self.sr = srs_to_spatialreference(self.srs) 

-

230 self.levels = {} 

-

231 for level in data["tileMatrices"]: 

-

232 lev = TileMatrix(level, self) 

-

233 self.levels[lev.id] = lev 

-

234 

-

235 if len(self.levels.keys()) == 0: 

-

236 raise Exception(f"TMS '{self.path}' has no level") 

-

237 

-

238 if data["orderedAxes"] != ["X", "Y"] and data["orderedAxes"] != ["Lon", "Lat"]: 

-

239 raise Exception( 

-

240 f"TMS '{self.path}' own invalid axes order : only X/Y or Lon/Lat are handled" 

-

241 ) 

-

242 

-

243 except JSONDecodeError as e: 

-

244 raise FormatError("JSON", self.path, e) 

-

245 

-

246 except KeyError as e: 

-

247 raise MissingAttributeError(self.path, e) 

-

248 

-

249 except RuntimeError as e: 

-

250 raise Exception( 

-

251 f"Wrong attribute 'crs' ('{self.srs}') in '{self.path}', not recognize by OSR. Trace : {e}" 

-

252 ) 

-

253 

-

254 def get_level(self, level_id: str) -> "TileMatrix": 

-

255 """Get one level according to its identifier 

-

256 

-

257 Args: 

-

258 level_id: Level identifier 

-

259 

-

260 Returns: 

-

261 The corresponding tile matrix, None if not present 

-

262 """ 

-

263 

-

264 return self.levels.get(level_id, None) 

-

265 

-

266 @property 

-

267 def sorted_levels(self) -> List[TileMatrix]: 

-

268 return sorted(self.levels.values(), key=lambda level: level.resolution) 

Coverage for src/rok4/utils.py: 97% (103 statements), coverage.py v7.6.1, created at 2024-10-01 15:08 +0000

1"""Provide functions to manipulate OGR / OSR entities 

-

2""" 

-

3 

-

4# -- IMPORTS -- 

-

5 

-

6# standard library 

-

7import os 

-

8import re 

-

9from typing import Tuple 

-

10 

-

11# 3rd party 

-

12from osgeo import gdal, ogr, osr 

-

13 

-

14# package 

-

15from rok4.enums import ColorFormat 

-

16 

-

17# -- GLOBALS -- 

-

18ogr.UseExceptions() 

-

19osr.UseExceptions() 

-

20gdal.UseExceptions() 

-

21 

-

22__SR_BOOK = {} 

-

23 

-

24 

-

25def srs_to_spatialreference(srs: str) -> "osgeo.osr.SpatialReference": 

-

26 """Convert coordinates system as string to OSR spatial reference 

-

27 

-

28 Using a cache, to instantiate a Spatial Reference from a string only once.

-

29 

-

30 Args: 

-

31 srs (str): coordinates system PROJ4 compliant, with authority and code, like EPSG:3857 or IGNF:LAMB93 

-

32 

-

33 Raises: 

-

34 RuntimeError: Provided SRS is invalid for OSR 

-

35 

-

36 Returns: 

-

37 osgeo.osr.SpatialReference: Corresponding OSR spatial reference 

-

38 """ 

-

39 

-

40 global __SR_BOOK 

-

41 

-

42 if srs.upper() not in __SR_BOOK: 

-

43 authority, code = srs.split(":", 1) 

-

44 

-

45 sr = osr.SpatialReference() 

-

46 if authority.upper() == "EPSG": 

-

47 sr.ImportFromEPSG(int(code)) 

-

48 else: 

-

49 sr.ImportFromProj4(f"+init={srs.upper()} +wktext") 

-

50 

-

51 __SR_BOOK[srs.upper()] = sr 

-

52 

-

53 return __SR_BOOK[srs.upper()] 

-

54 
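The `__SR_BOOK` cache above normalizes the key with `upper()` so each coordinate system is built only once. The pattern, sketched without GDAL (the `factory` callable stands in for OSR object creation):

```python
_CACHE = {}

def cached_parse(key, factory):
    """Build the object for a normalized key only once, then reuse it."""
    norm = key.upper()
    if norm not in _CACHE:
        _CACHE[norm] = factory(norm)
    return _CACHE[norm]

# Both spellings normalize to the same key, so the same object comes back
a = cached_parse("epsg:3857", lambda k: object())
b = cached_parse("EPSG:3857", lambda k: object())
print(a is b)  # True
```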

-

55 

-

56def bbox_to_geometry( 

-

57 bbox: Tuple[float, float, float, float], densification: int = 0 

-

58) -> "osgeo.ogr.Geometry": 

-

59 """Convert bbox coordinates to OGR geometry 

-

60 

-

61 Args: 

-

62 bbox (Tuple[float, float, float, float]): bounding box (xmin, ymin, xmax, ymax) 

-

63 densification (int, optional): Number of points to add for each side of the bounding box. Defaults to 0.

-

64 

-

65 Raises: 

-

66 RuntimeError: Provided SRS is invalid for OSR 

-

67 

-

68 Returns: 

-

69 osgeo.ogr.Geometry: Corresponding OGR geometry, with spatial reference if provided 

-

70 """ 

-

71 

-

72 ring = ogr.Geometry(ogr.wkbLinearRing) 

-

73 

-

74 if densification > 0: 

-

75 step_x = (bbox[2] - bbox[0]) / (densification + 1) 

-

76 step_y = (bbox[3] - bbox[1]) / (densification + 1) 

-

77 

-

78 for i in range(densification + 1): 

-

79 ring.AddPoint(bbox[0] + step_x * i, bbox[1]) 

-

80 for i in range(densification + 1): 

-

81 ring.AddPoint(bbox[2], bbox[1] + step_y * i) 

-

82 for i in range(densification + 1): 

-

83 ring.AddPoint(bbox[2] - step_x * i, bbox[3]) 

-

84 for i in range(densification + 1): 

-

85 ring.AddPoint(bbox[0], bbox[3] - step_y * i) 

-

86 ring.AddPoint(bbox[0], bbox[1]) 

-

87 

-

88 else: 

-

89 ring.AddPoint(bbox[0], bbox[1]) 

-

90 ring.AddPoint(bbox[2], bbox[1]) 

-

91 ring.AddPoint(bbox[2], bbox[3]) 

-

92 ring.AddPoint(bbox[0], bbox[3]) 

-

93 ring.AddPoint(bbox[0], bbox[1]) 

-

94 

-

95 geom = ogr.Geometry(ogr.wkbPolygon) 

-

96 geom.AddGeometry(ring) 

-

97 geom.SetCoordinateDimension(2) 

-

98 

-

99 return geom 

-

100 
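The ring construction in `bbox_to_geometry` can be sketched without OGR: each side gets `densification` extra points, and the ring is closed by repeating the first point. A minimal sketch; the bbox values below are made up:

```python
def densified_ring(bbox, densification=0):
    """Closed ring points over bbox (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = bbox
    if densification <= 0:
        return [(xmin, ymin), (xmax, ymin), (xmax, ymax), (xmin, ymax), (xmin, ymin)]
    step_x = (xmax - xmin) / (densification + 1)
    step_y = (ymax - ymin) / (densification + 1)
    points = []
    for i in range(densification + 1):
        points.append((xmin + step_x * i, ymin))  # south side, west to east
    for i in range(densification + 1):
        points.append((xmax, ymin + step_y * i))  # east side, south to north
    for i in range(densification + 1):
        points.append((xmax - step_x * i, ymax))  # north side, east to west
    for i in range(densification + 1):
        points.append((xmin, ymax - step_y * i))  # west side, north to south
    points.append((xmin, ymin))                   # close the ring
    return points

print(len(densified_ring((0, 0, 10, 10), densification=1)))  # 9
```

Densifying matters for `reproject_bbox` below: the envelope of the reprojected ring only contains the input extent if the sides carry intermediate points, since straight edges bend under reprojection.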

-

101 

-

102def reproject_bbox( 

-

103 bbox: Tuple[float, float, float, float], srs_src: str, srs_dst: str, densification: int = 5 

-

104) -> Tuple[float, float, float, float]: 

-

105 """Return bounding box in other coordinates system 

-

106 

-

107 Points are added to ensure the output bounding box contains the input bounding box

-

108 

-

109 Args: 

-

110 bbox (Tuple[float, float, float, float]): bounding box (xmin, ymin, xmax, ymax) with source coordinates system 

-

111 srs_src (str): source coordinates system 

-

112 srs_dst (str): destination coordinates system 

-

113 densification (int, optional): Number of points to add for each side of the bounding box. Defaults to 5.

-

114 

-

115 Returns: 

-

116 Tuple[float, float, float, float]: bounding box (xmin, ymin, xmax, ymax) with destination coordinates system 

-

117 """ 

-

118 

-

119 sr_src = srs_to_spatialreference(srs_src) 

-

120 sr_src_inv = sr_src.EPSGTreatsAsLatLong() or sr_src.EPSGTreatsAsNorthingEasting() 

-

121 

-

122 sr_dst = srs_to_spatialreference(srs_dst) 

-

123 sr_dst_inv = sr_dst.EPSGTreatsAsLatLong() or sr_dst.EPSGTreatsAsNorthingEasting() 

-

124 

-

125 if sr_src.IsSame(sr_dst) and sr_src_inv == sr_dst_inv: 

-

126 # The systems really are the same, with the same axes order

-

127 return bbox 

-

128 elif sr_src.IsSame(sr_dst) and sr_src_inv != sr_dst_inv: 

-

129 # The systems are the same for OSR, but the axes order differs

-

130 return (bbox[1], bbox[0], bbox[3], bbox[2]) 

-

131 

-

132 # Different systems

-

133 

-

134 bbox_src = bbox_to_geometry(bbox, densification) 

-

135 bbox_src.AssignSpatialReference(sr_src) 

-

136 

-

137 bbox_dst = bbox_src.Clone() 

-

138 os.environ["OGR_ENABLE_PARTIAL_REPROJECTION"] = "YES" 

-

139 bbox_dst.TransformTo(sr_dst) 

-

140 

-

141 env = bbox_dst.GetEnvelope() 

-

142 return (env[0], env[2], env[1], env[3]) 

-

143 

-

144 

-

145def reproject_point( 

-

146 point: Tuple[float, float], 

-

147 sr_src: "osgeo.osr.SpatialReference", 

-

148 sr_dst: "osgeo.osr.SpatialReference", 

-

149) -> Tuple[float, float]: 

-

150 """Reproject a point 

-

151 

-

152 Args: 

-

153 point (Tuple[float, float]): source spatial reference point 

-

154 sr_src (osgeo.osr.SpatialReference): source spatial reference 

-

155 sr_dst (osgeo.osr.SpatialReference): destination spatial reference 

-

156 

-

157 Returns: 

-

158 Tuple[float, float]: X/Y in destination spatial reference 

-

159 """ 

-

160 

-

161 sr_src_inv = sr_src.EPSGTreatsAsLatLong() or sr_src.EPSGTreatsAsNorthingEasting() 

-

162 sr_dst_inv = sr_dst.EPSGTreatsAsLatLong() or sr_dst.EPSGTreatsAsNorthingEasting() 

-

163 

-

164 if sr_src.IsSame(sr_dst) and sr_src_inv == sr_dst_inv: 

-

165 # The systems really are the same, with the same axes order

-

166 return (point[0], point[1]) 

-

167 elif sr_src.IsSame(sr_dst) and sr_src_inv != sr_dst_inv: 

-

168 # The systems are the same for OSR, but the axes order differs

-

169 return (point[1], point[0]) 

-

170 

-

171 # Different systems

-

172 ct = osr.CreateCoordinateTransformation(sr_src, sr_dst) 

-

173 x_dst, y_dst, z_dst = ct.TransformPoint(point[0], point[1]) 

-

174 

-

175 return (x_dst, y_dst) 

-

176 

-

177 

-

178def compute_bbox(source_dataset: gdal.Dataset) -> Tuple: 

-

179 """Image boundingbox computing method 

-

180 

-

181 Args: 

-

182 source_dataset (gdal.Dataset): Dataset instantiated

-

183 from the raster image 

-

184 

-

185 Limitations: 

-

186 Image's axis must be parallel to SRS' axis 

-

187 

-

188 Raises: 

-

189 AttributeError: source_dataset is not a gdal.Dataset instance. 

-

190 Exception: The dataset does not contain transform data. 

-

191 """ 

-

192 bbox = None 

-

193 transform_vector = source_dataset.GetGeoTransform() 

-

194 if transform_vector is None: 

-

195 raise Exception( 

-

196 "No transform vector found in the dataset created from " 

-

197 + f"the following file : {source_dataset.GetFileList()[0]}" 

-

198 ) 

-

199 width = source_dataset.RasterXSize 

-

200 height = source_dataset.RasterYSize 

-

201 x_range = ( 

-

202 transform_vector[0], 

-

203 transform_vector[0] + width * transform_vector[1] + height * transform_vector[2], 

-

204 ) 

-

205 y_range = ( 

-

206 transform_vector[3], 

-

207 transform_vector[3] + width * transform_vector[4] + height * transform_vector[5], 

-

208 ) 

-

209 spatial_ref = source_dataset.GetSpatialRef() 

-

210 if spatial_ref is not None and spatial_ref.GetDataAxisToSRSAxisMapping() == [2, 1]: 

-

211 # Ground coordinates in (latitude, longitude) order

-

212 # => swap the ground coordinates relative to the image

-

213 bbox = (min(y_range), min(x_range), max(y_range), max(x_range)) 

-

214 else: 

-

215 # Ground coordinates in (longitude, latitude) order, or no SRS

-

216 # => ground coordinates are in the same order as the image's

-

217 bbox = (min(x_range), min(y_range), max(x_range), max(y_range)) 

-

218 return bbox 

-

219 
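The geotransform arithmetic in `compute_bbox` can be sketched standalone: the two extreme corners are the origin and the origin plus width/height times the pixel vectors, and taking min/max makes the bbox orientation-independent. The geotransform below is a made-up north-up raster:

```python
def bbox_from_geotransform(gt, width, height):
    """Bounding box (xmin, ymin, xmax, ymax) from a GDAL-style geotransform (X/Y order)."""
    x_range = (gt[0], gt[0] + width * gt[1] + height * gt[2])
    y_range = (gt[3], gt[3] + width * gt[4] + height * gt[5])
    return (min(x_range), min(y_range), max(x_range), max(y_range))

# Hypothetical raster: origin (100, 200), 0.5-unit pixels, 400x300 pixels, north-up
print(bbox_from_geotransform((100.0, 0.5, 0.0, 200.0, 0.0, -0.5), 400, 300))
# (100.0, 50.0, 300.0, 200.0)
```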

-

220 

-

221def compute_format(dataset: gdal.Dataset, path: str = None) -> ColorFormat: 

-

222 """Image color format computing method 

-

223 

-

224 Args: 

-

225 dataset (gdal.Dataset): Dataset instantiated from the image

-

226 path (str, optional): path to the original file/object

-

227 

-

228 Raises: 

-

229 AttributeError: source_dataset is not a gdal.Dataset instance. 

-

230 Exception: No color band found or unsupported color format. 

-

231 """ 

-

232 color_format = None 

-

233 if path is None: 

-

234 path = dataset.GetFileList()[0] 

-

235 if dataset.RasterCount < 1: 

-

236 raise Exception(f"Image {path} contains no color band.") 

-

237 

-

238 band_1_datatype = dataset.GetRasterBand(1).DataType 

-

239 data_type_name = gdal.GetDataTypeName(band_1_datatype) 

-

240 data_type_size = gdal.GetDataTypeSize(band_1_datatype) 

-

241 color_interpretation = dataset.GetRasterBand(1).GetRasterColorInterpretation() 

-

242 color_name = None 

-

243 if color_interpretation is not None: 

-

244 color_name = gdal.GetColorInterpretationName(color_interpretation) 

-

245 compression_regex_match = re.search(r"COMPRESSION\s*=\s*PACKBITS", gdal.Info(dataset)) 

-

246 

-

247 if ( 

-

248 data_type_name == "Byte" 

-

249 and data_type_size == 8 

-

250 and color_name == "Palette" 

-

251 and compression_regex_match 

-

252 ): 

-

253 # Interpreted by libTIFF as 1-bit black and white

-

254 color_format = ColorFormat.BIT 

-

255 elif data_type_name == "Byte" and data_type_size == 8: 

-

256 color_format = ColorFormat.UINT8 

-

257 elif data_type_name == "Float32" and data_type_size == 32: 

-

258 color_format = ColorFormat.FLOAT32 

-

259 else: 

-

260 raise Exception( 

-

261 f"Unsupported color format for image {path} : " 

-

262 + f"'{data_type_name}' ({data_type_size} bits)" 

-

263 ) 

-

264 return color_format 

Coverage for src/rok4/vector.py: 88% (100 statements), coverage.py v7.6.1, created at 2024-10-01 15:08 +0000

"""Provide class to read information on vector data from file path or object path

The module contains the following class:

- `Vector` - Data Vector
"""

# -- IMPORTS --

# standard library
import os
import tempfile

# 3rd party
from osgeo import ogr

# package
from rok4.storage import copy, get_osgeo_path

# -- GLOBALS --

# Enable GDAL/OGR exceptions
ogr.UseExceptions()


class Vector:
    """A data vector

    Attributes:
        path (str): path to the file/object
        bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
        layers (List[Tuple[str, int, List[Tuple[str, str]]]]): vector layers with their name, their number of objects and their attributes
    """

    @classmethod
    def from_file(cls, path: str, **kwargs) -> "Vector":
        """Constructor method of a Vector from a file (Shapefile, Geopackage, CSV and GeoJSON)

        Args:
            path (str): path to the file/object
            **csv (Dict[str, str]): dictionary of CSV parameters:
                - srs (str) ("EPSG:2154" if not provided): spatial reference system of the geometry
                - column_x (str) ("x" if not provided): field of the x coordinate
                - column_y (str) ("y" if not provided): field of the y coordinate
                - column_wkt (str) (None if not provided): field of the WKT of the geometry, if WKT is used to define coordinates

        Examples:

            from rok4.vector import Vector

            try:
                vector = Vector.from_file("file://tests/fixtures/ARRONDISSEMENT.shp")
                vector_csv1 = Vector.from_file("file://tests/fixtures/vector.csv", csv={"delimiter": ";", "column_x": "x", "column_y": "y"})
                vector_csv2 = Vector.from_file("file://tests/fixtures/vector2.csv", csv={"delimiter": ";", "column_wkt": "WKT"})

            except Exception as e:
                print(f"Vector creation raises an exception: {e}")

        Raises:
            MissingEnvironmentError: Missing object storage information
            StorageError: Storage read issue
            Exception: Wrong column
            Exception: Wrong data in column
            Exception: Wrong format of file
            Exception: Wrong data in the file
        """

        self = cls()

        self.path = path

        path_split = path.split("/")

        if path_split[0] == "ceph:" or path.endswith(".csv"):
            if path.endswith(".shp"):
                with tempfile.TemporaryDirectory() as tmp:
                    tmp_path = tmp + "/" + path_split[-1][:-4]

                    copy(path, "file://" + tmp_path + ".shp")
                    copy(path[:-4] + ".shx", "file://" + tmp_path + ".shx")
                    copy(path[:-4] + ".cpg", "file://" + tmp_path + ".cpg")
                    copy(path[:-4] + ".dbf", "file://" + tmp_path + ".dbf")
                    copy(path[:-4] + ".prj", "file://" + tmp_path + ".prj")

                    dataSource = ogr.Open(tmp_path + ".shp", 0)

            elif path.endswith(".gpkg"):
                with tempfile.TemporaryDirectory() as tmp:
                    tmp_path = tmp + "/" + path_split[-1][:-5]

                    copy(path, "file://" + tmp_path + ".gpkg")

                    dataSource = ogr.Open(tmp_path + ".gpkg", 0)

            elif path.endswith(".geojson"):
                with tempfile.TemporaryDirectory() as tmp:
                    tmp_path = tmp + "/" + path_split[-1][:-8]

                    copy(path, "file://" + tmp_path + ".geojson")

                    dataSource = ogr.Open(tmp_path + ".geojson", 0)

            elif path.endswith(".csv"):
                # Retrieve optional CSV parameters
                if "csv" in kwargs:
                    csv = kwargs["csv"]
                else:
                    csv = {}

                if "srs" in csv and csv["srs"] is not None:
                    srs = csv["srs"]
                else:
                    srs = "EPSG:2154"

                if "column_x" in csv and csv["column_x"] is not None:
                    column_x = csv["column_x"]
                else:
                    column_x = "x"

                if "column_y" in csv and csv["column_y"] is not None:
                    column_y = csv["column_y"]
                else:
                    column_y = "y"

                if "column_wkt" in csv:
                    column_wkt = csv["column_wkt"]
                else:
                    column_wkt = None

                with tempfile.TemporaryDirectory() as tmp:
                    tmp_path = tmp + "/" + path_split[-1][:-4]
                    name_fich = path_split[-1][:-4]

                    copy(path, "file://" + tmp_path + ".csv")

                    with tempfile.NamedTemporaryFile(
                        mode="w", suffix=".vrt", dir=tmp, delete=False
                    ) as tmp2:
                        vrt_file = "<OGRVRTDataSource>\n"
                        vrt_file += '<OGRVRTLayer name="' + name_fich + '">\n'
                        vrt_file += "<SrcDataSource>" + tmp_path + ".csv</SrcDataSource>\n"
                        vrt_file += "<SrcLayer>" + name_fich + "</SrcLayer>\n"
                        vrt_file += "<LayerSRS>" + srs + "</LayerSRS>\n"
                        if column_wkt is None:
                            vrt_file += (
                                '<GeometryField encoding="PointFromColumns" x="'
                                + column_x
                                + '" y="'
                                + column_y
                                + '"/>\n'
                            )
                        else:
                            vrt_file += (
                                '<GeometryField encoding="WKT" field="' + column_wkt + '"/>\n'
                            )
                        vrt_file += "</OGRVRTLayer>\n"
                        vrt_file += "</OGRVRTDataSource>"
                        tmp2.write(vrt_file)
                    dataSourceVRT = ogr.Open(tmp2.name, 0)
                    os.remove(tmp2.name)
                    dataSource = ogr.GetDriverByName("ESRI Shapefile").CopyDataSource(
                        dataSourceVRT, tmp_path + ".shp"
                    )

            else:
                raise Exception("This format of file cannot be loaded")

        else:
            dataSource = ogr.Open(get_osgeo_path(path), 0)

        multipolygon = ogr.Geometry(ogr.wkbGeometryCollection)
        try:
            layer = dataSource.GetLayer()
        except AttributeError:
            raise Exception(f"The content of {self.path} cannot be read")

        layers = []
        for i in range(dataSource.GetLayerCount()):
            layer = dataSource.GetLayer(i)
            name = layer.GetName()
            count = layer.GetFeatureCount()
            layerDefinition = layer.GetLayerDefn()
            attributes = []
            for j in range(layerDefinition.GetFieldCount()):
                fieldName = layerDefinition.GetFieldDefn(j).GetName()
                fieldTypeCode = layerDefinition.GetFieldDefn(j).GetType()
                fieldType = layerDefinition.GetFieldDefn(j).GetFieldTypeName(fieldTypeCode)
                attributes += [(fieldName, fieldType)]
            for feature in layer:
                geom = feature.GetGeometryRef()
                if geom is not None:
                    multipolygon.AddGeometry(geom)
            layers += [(name, count, attributes)]

        self.layers = layers
        self.bbox = multipolygon.GetEnvelope()

        return self

    @classmethod
    def from_parameters(cls, path: str, bbox: tuple, layers: list) -> "Vector":
        """Constructor method of a Vector from parameters

        Args:
            path (str): path to the file/object
            bbox (Tuple[float, float, float, float]): bounding rectangle in the data projection
            layers (List[Tuple[str, int, List[Tuple[str, str]]]]): vector layers with their name, their number of objects and their attributes

        Examples:

            try:
                vector = Vector.from_parameters("file://tests/fixtures/ARRONDISSEMENT.shp", (1, 2, 3, 4), [('ARRONDISSEMENT', 14, [('ID', 'String'), ('NOM', 'String'), ('INSEE_ARR', 'String'), ('INSEE_DEP', 'String'), ('INSEE_REG', 'String'), ('ID_AUT_ADM', 'String'), ('DATE_CREAT', 'String'), ('DATE_MAJ', 'String'), ('DATE_APP', 'Date'), ('DATE_CONF', 'Date')])])

            except Exception as e:
                print(f"Vector creation raises an exception: {e}")
        """

        self = cls()

        self.path = path
        self.bbox = bbox
        self.layers = layers

        return self
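The CSV branch of `from_file` wraps the file in an OGR VRT before opening it. That XML assembly can be sketched as a standalone, GDAL-free helper; `build_csv_vrt` is a hypothetical name, and the defaults mirror those used in `from_file` (`EPSG:2154`, columns `x`/`y`).

```python
from typing import Optional

# Hypothetical helper reproducing the VRT string built in the CSV branch above.
def build_csv_vrt(
    csv_path: str,
    layer_name: str,
    srs: str = "EPSG:2154",
    column_x: str = "x",
    column_y: str = "y",
    column_wkt: Optional[str] = None,
) -> str:
    vrt = "<OGRVRTDataSource>\n"
    vrt += f'<OGRVRTLayer name="{layer_name}">\n'
    vrt += f"<SrcDataSource>{csv_path}</SrcDataSource>\n"
    vrt += f"<SrcLayer>{layer_name}</SrcLayer>\n"
    vrt += f"<LayerSRS>{srs}</LayerSRS>\n"
    if column_wkt is None:
        # Point geometry built from two coordinate columns
        vrt += f'<GeometryField encoding="PointFromColumns" x="{column_x}" y="{column_y}"/>\n'
    else:
        # Geometry read from a WKT column
        vrt += f'<GeometryField encoding="WKT" field="{column_wkt}"/>\n'
    vrt += "</OGRVRTLayer>\n"
    vrt += "</OGRVRTDataSource>"
    return vrt

print(build_csv_vrt("/tmp/vector.csv", "vector"))
```

Opening such a `.vrt` with `ogr.Open` makes OGR parse the CSV as a spatial layer, which the real code then materializes as a Shapefile with `CopyDataSource`.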
diff --git a/2.2.2/unit-tests/index.html b/2.2.2/unit-tests/index.html
deleted file mode 100644
index f8f0422..0000000
--- a/2.2.2/unit-tests/index.html
+++ /dev/null
@@ -1,510 +0,0 @@

Unit test report

\ No newline at end of file
diff --git a/versions.json b/versions.json
index 66f520c..c07886a 100644
--- a/versions.json
+++ b/versions.json
@@ -6,11 +6,6 @@
       "latest"
     ]
   },
-  {
-    "version": "2.2.2",
-    "title": "Version 2.2.2",
-    "aliases": []
-  },
   {
     "version": "2.1.5",
     "title": "Version 2.1.5",