diff --git a/docs/docs/sprint10.md b/docs/docs/sprint10.md
index 83184cd..cfa6991 100644
--- a/docs/docs/sprint10.md
+++ b/docs/docs/sprint10.md
@@ -68,6 +68,8 @@ Risk burndown:

## Sprint Review Results

+![Sprint 10 Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint10/burndown.png)
+

### Delivered stories

diff --git a/docs/docs/sprint11.md b/docs/docs/sprint11.md
index 34a76cc..1c64e6f 100644
--- a/docs/docs/sprint11.md
+++ b/docs/docs/sprint11.md
@@ -18,7 +18,7 @@ id: sprint11

## Objective

-
+The goal of this sprint was to close out the previous sprint's pending work, given that none of the members managed to be productive during sprint 10.

## General data

@@ -67,14 +67,64 @@ Risk burndown:

## Sprint Review Results

+![Sprint 11 Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint11/burndown.png)
+

### Delivered stories

+![Sprint 11 Stories](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint11/historias.png)
+

### Delivered debts

+All of the previous sprint's stories were delivered in this sprint.
+

## Sprint Retrospective

+### What are we feeling?
+
+#### Positive
+- The end is near;
+- The team came back to life this week;
+- The team was quite productive;
+- We finished the frontend.
+
+#### Negative
+- The client's indecision about the technology;
+- A flawed technology choice.
+
+
+### What are we doing?
+
+#### Positive
+- We went back to doing our rituals consistently;
+- We are no longer accepting every wild opinion from the client;
+- The team is working again and is motivated.
+
+#### Negative
+
+### What are we hearing?
+
+#### Positive
+- The professor's feedback was very good;
+- The professor's opinion on the team's decisions.
+
+#### Negative
+- That the team does not know how to research;
+- That the team's technology choice was flawed (the choice was not made by the team).
+
+
+### What are we seeing?
+
+#### Positive
+- The end of the project;
+- The MDS team is more mature (emotionally and technically).
+
+#### Negative
+
+
+

### Possible improvements

diff --git a/docs/docs/sprint12.md b/docs/docs/sprint12.md
new file mode 100644
index 0000000..eb3f5ee
--- /dev/null
+++ b/docs/docs/sprint12.md
@@ -0,0 +1,110 @@
+---
+title: Sprint 12
+author: Matheus Joranhezon
+authorURL: https://github.com/joranhezon
+authorFBID: 100002504848674
+id: sprint12
+---
+
+# Sprint Planning
+
+| Members present at sprint planning |
+|---------------------|
+| João Victor |
+| Marcos Nery |
+| Matheus Joranhezon |
+| Rogerio |
+| André |
+| Kaique |
+| Vinicius |
+
+## Objective
+
+The whole team was tasked with finding a solution to the problems encountered while using the new technology, GraphQL. It was decided that at the end of the sprint the members would present their solutions and choose the most viable one for the project. In addition, the client's DevOps requests were allocated to this sprint.
+
+## General data
+
+**Start date:** 03/11/2018
+**End date:** 10/11/2018
+
+**Planned points:** 39
+**Added points:** 0
+**Total points:** 39
+
+
+## Issues
+
+The issues can be found [here](https://github.com/fga-eps-mds/2018.2-ComexStat/milestone/16).
+
+
+### Allocated debts
+
+This sprint's debt concerns the EVM.
+
+## Possible risks mapped for the sprint
+
+For this sprint, the risks identified were:
+
+![Sprint 12 Risks](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint12/riscos.png)
+
+In this sprint, the risk concerning team productivity remained, and it will persist until the end of the semester due to the fatigue and pressure placed on the members by coursework and personal problems. In addition, the technology risk escalated considerably, since it could drive the project to a disastrous end.
+
+Risk burndown:
+
+![Sprint 12 Risk Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint12/burndownriscos.png)
+
+
+# Results
+
+
+## Sprint Review Results
+
+![Sprint 12 Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint12/burndown.png)
+
+
+### Delivered stories
+
+![Sprint 12 Stories](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint12/historias.png)
+
+
+### Delivered debts
+
+
+## Sprint Retrospective
+
+### What are we feeling?
+
+#### Positive
+- It is almost over;
+- Pride in closing the issue that had been in debt for 3 sprints.
+
+#### Negative
+- Anxious for it to end;
+- Exhaustion;
+- Illness.
+
+### What are we doing?
+
+#### Positive
+- Finishing the project;
+- Closing the issues despite being exhausted.
+
+#### Negative
+- Dragging ourselves to the finish line;
+- A weekend meeting, in the morning, in the rain and the cold.
+
+### What are we hearing?
+
+#### Positive
+- That it is working out.
+
+#### Negative
+- That we are behind schedule.
+
+### What are we thinking?
+#### Positive
+- We overcame everything without Fabíola.
+
+#### Negative
+- That we want it to end;
+- That we do not want to come to the Saturday meeting.

diff --git a/docs/docs/sprint13.md b/docs/docs/sprint13.md
new file mode 100644
index 0000000..47951a0
--- /dev/null
+++ b/docs/docs/sprint13.md
@@ -0,0 +1,107 @@
+---
+title: Sprint 13
+author: Matheus Joranhezon
+authorURL: https://github.com/joranhezon
+authorFBID: 100002504848674
+id: sprint13
+---
+
+# Sprint Planning
+
+| Members present at sprint planning |
+|---------------------|
+| Matheus Joranhezon |
+| Sannya |
+| Rogerio |
+| Kaique |
+| André |
+
+## Objective
+
+Integrate the solution presented for the technology problem and close out the DevOps debts.
+
+
+
+## General data
+
+**Start date:** 10/11/2018
+**End date:** 17/11/2018
+
+**Planned points:** 28
+**Added points:** 26
+**Total points:** 54
+
+
+## Pairings
+- Sannya and Vinícius (DevOps debts)
+- Rogério and Kaique ([Issue 192](https://github.com/fga-eps-mds/2018.2-comexstat/issues/192))
+- Marcos, André, João ([Issue 198](https://github.com/fga-eps-mds/2018.2-comexstat/issues/198))
+- Matheus (client requests)
+
+## Issues
+
+The issues can be found [here](https://github.com/fga-eps-mds/2018.2-ComexStat/milestone/17).
+
+
+### Allocated debts
+
+This sprint's debts concern the DevOps issues.
+
+## Possible risks mapped for the sprint
+
+For this sprint, the risks identified were:
+
+![Sprint 13 Risks](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint13/riscos.png)
+
+The risks for this sprint remained the same, except for the technology risk, which dropped thanks to the solution found in sprint 12.
+
+Risk burndown:
+
+![Sprint 13 Risk Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint13/burndownriscos.png)
+
+
+# Results
+
+
+## Sprint Review Results
+
+
+### Delivered stories
+
+![Sprint 13 Stories](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint13/historias.png)
+
+
+### Delivered debts
+
+
+## Sprint Retrospective
+
+### What are we feeling?
+
+#### Positive
+- That the semester is ending.
+
+#### Negative
+- Nobody can take it anymore.
+
+### What are we doing?
+
+#### Positive
+- Closing out our pending work.
+
+#### Negative
+
+### What are we hearing?
+
+#### Positive
+- That it is almost over.
+
+#### Negative
+- That everyone is exhausted (all the teams).
+
+### What are we seeing?
+
+#### Positive
+- Closing the project and its pending work.
+
+#### Negative

diff --git a/docs/docs/sprint14.md b/docs/docs/sprint14.md
new file mode 100644
index 0000000..07d527c
--- /dev/null
+++ b/docs/docs/sprint14.md
@@ -0,0 +1,78 @@
+---
+title: Sprint 14
+author: Matheus Joranhezon
+authorURL: https://github.com/joranhezon
+authorFBID: 100002504848674
+id: sprint14
+---
+
+# Sprint Planning
+
+| Members present at sprint planning |
+|---------------------|
+| Matheus Joranhezon |
+| Sannya |
+| Rogerio |
+| Kaique |
+| André |
+
+## Objective
+
+Integrate the solution presented for the technology problem and close out the DevOps debts.
+
+
+
+## General data
+
+**Start date:** 10/11/2018
+**End date:** 17/11/2018
+
+**Planned points:** 28
+**Added points:** 26
+**Total points:** 54
+
+
+## Pairings
+- Sannya and Vinícius (DevOps debts)
+- Rogério and Kaique ([Issue 192](https://github.com/fga-eps-mds/2018.2-comexstat/issues/192))
+- Marcos, André, João ([Issue 198](https://github.com/fga-eps-mds/2018.2-comexstat/issues/198))
+- Matheus (client requests)
+
+## Issues
+
+The issues can be found [here](https://github.com/fga-eps-mds/2018.2-ComexStat/milestone/17).
+
+
+### Allocated debts
+
+This sprint's debts concern the DevOps issues.
+
+## Possible risks mapped for the sprint
+
+For this sprint, the risks identified were:
+
+![Sprint 13 Risks](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint13/riscos.png)
+
+The risks for this sprint remained the same, except for the technology risk, which dropped thanks to the solution found in sprint 12.
+
+Risk burndown:
+
+![Sprint 13 Risk Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint13/burndownriscos.png)
+
+
+# Results
+
+
+## Sprint Review Results
+
+
+### Delivered stories
+
+
+### Delivered debts
+
+
+## Sprint Retrospective
+
+
+### Possible improvements

diff --git a/docs/docs/sprint9.md b/docs/docs/sprint9.md
index 3271a04..25e1e9c 100644
--- a/docs/docs/sprint9.md
+++ b/docs/docs/sprint9.md
@@ -67,10 +67,12 @@ Risk burndown:

## Sprint Review Results

+![Sprint 9 Burndown](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint9/burndown.png)

### Delivered stories

+![Sprint 9 Stories](https://fga-eps-mds.github.io/2018.2-ComexStat/img/sprints/sprint9/historias.png)

### Delivered debts

diff --git a/docs/website/blog/2018-09-15-sprint-5.md b/docs/website/blog/2018-09-15-sprint-5.md
index 9013155..428432b 100644
--- a/docs/website/blog/2018-09-15-sprint-5.md
+++ b/docs/website/blog/2018-09-15-sprint-5.md
@@ -15,7 +15,6 @@ authorFBID: 100002504848674
| João Victor |
| Matheus Vitor |
| Marcos Nery |
-| Matheus Joranhezon |
| Sannya |
| Vinicius |
| Kaique |

diff --git a/docs/website/blog/2018-09-22-sprint-6.md b/docs/website/blog/2018-09-22-sprint-6.md
index 75378e1..8052d86 100644
--- a/docs/website/blog/2018-09-22-sprint-6.md
+++ b/docs/website/blog/2018-09-22-sprint-6.md
@@ -12,7 +12,6 @@ authorFBID: 100002504848674
| André Lucas |
| Fabíola Malta |
| João Victor |
-| Matheus Vitor |
| Marcos Nery |
| Matheus Joranhezon |
| Sannya |

diff --git a/docs/website/sidebars.json b/docs/website/sidebars.json
index d4e818d..e3d3c45 100644
--- a/docs/website/sidebars.json
+++ b/docs/website/sidebars.json
@@ -5,6 +5,6 @@
    "Capacitação": ["treinamentos", "microsservicos", "devOps", "pipelineDevOps", "utilizandoDockerDjango"],
    "Visão de Produto": ["docVisao", "termodeabertura", "eap", "canvas", "prototipo", "guiaestilo", "roadmap" ],
    "Arquitetura": ["definicaoArquitetura",
"docArquitetura"], - "Sprints Release 2": ["sprint7", "sprint8", "sprint9", "sprint10", "sprint11"] + "Sprints Release 2": ["sprint7", "sprint8", "sprint9", "sprint10", "sprint11", "sprint12", "sprint13", "sprint14"] } } diff --git a/docs/website/static/img/sprints/sprint10/burndown.png b/docs/website/static/img/sprints/sprint10/burndown.png new file mode 100644 index 0000000..f2df253 Binary files /dev/null and b/docs/website/static/img/sprints/sprint10/burndown.png differ diff --git a/docs/website/static/img/sprints/sprint11/burndown.png b/docs/website/static/img/sprints/sprint11/burndown.png new file mode 100644 index 0000000..ebcd639 Binary files /dev/null and b/docs/website/static/img/sprints/sprint11/burndown.png differ diff --git a/docs/website/static/img/sprints/sprint11/historias.png b/docs/website/static/img/sprints/sprint11/historias.png new file mode 100644 index 0000000..5a03baa Binary files /dev/null and b/docs/website/static/img/sprints/sprint11/historias.png differ diff --git a/docs/website/static/img/sprints/sprint12/burndown.png b/docs/website/static/img/sprints/sprint12/burndown.png new file mode 100644 index 0000000..2936af5 Binary files /dev/null and b/docs/website/static/img/sprints/sprint12/burndown.png differ diff --git a/docs/website/static/img/sprints/sprint12/burndownriscos.png b/docs/website/static/img/sprints/sprint12/burndownriscos.png new file mode 100644 index 0000000..da2fb53 Binary files /dev/null and b/docs/website/static/img/sprints/sprint12/burndownriscos.png differ diff --git a/docs/website/static/img/sprints/sprint12/historias.png b/docs/website/static/img/sprints/sprint12/historias.png new file mode 100644 index 0000000..b5fd0b1 Binary files /dev/null and b/docs/website/static/img/sprints/sprint12/historias.png differ diff --git a/docs/website/static/img/sprints/sprint12/riscos.png b/docs/website/static/img/sprints/sprint12/riscos.png new file mode 100644 index 0000000..ddb15fa Binary files /dev/null and 
b/docs/website/static/img/sprints/sprint12/riscos.png differ diff --git a/docs/website/static/img/sprints/sprint13/burndownriscos.png b/docs/website/static/img/sprints/sprint13/burndownriscos.png new file mode 100644 index 0000000..0567e84 Binary files /dev/null and b/docs/website/static/img/sprints/sprint13/burndownriscos.png differ diff --git a/docs/website/static/img/sprints/sprint13/historias.png b/docs/website/static/img/sprints/sprint13/historias.png new file mode 100644 index 0000000..1c29852 Binary files /dev/null and b/docs/website/static/img/sprints/sprint13/historias.png differ diff --git a/docs/website/static/img/sprints/sprint13/riscos.png b/docs/website/static/img/sprints/sprint13/riscos.png new file mode 100644 index 0000000..c901950 Binary files /dev/null and b/docs/website/static/img/sprints/sprint13/riscos.png differ diff --git a/docs/website/static/img/sprints/sprint9/burndown.png b/docs/website/static/img/sprints/sprint9/burndown.png new file mode 100644 index 0000000..d436f28 Binary files /dev/null and b/docs/website/static/img/sprints/sprint9/burndown.png differ diff --git a/docs/website/static/img/sprints/sprint9/historias.png b/docs/website/static/img/sprints/sprint9/historias.png new file mode 100644 index 0000000..270030e Binary files /dev/null and b/docs/website/static/img/sprints/sprint9/historias.png differ diff --git a/src/comex_stat/assets/schema.py b/src/comex_stat/assets/schema.py index b8c5e3f..ba74794 100644 --- a/src/comex_stat/assets/schema.py +++ b/src/comex_stat/assets/schema.py @@ -1,18 +1,17 @@ -import json -from datetime import datetime, time - -import graphene from comex_stat.assets.models import (CGCE, CUCI, NCM, SH, AssetExportFacts, AssetImportFacts, Country, FederativeUnit, TradeBlocs, Transportation, Urf) - -from django_filters import FilterSet, CharFilter -from django.forms import DateField, Field -from django_filters.filters import RangeFilter -from django_filters.utils import handle_timezone from 
graphene_django.filter.fields import DjangoFilterConnectionField from graphene_django.types import DjangoObjectType +from django_filters.utils import handle_timezone +from django_filters import FilterSet, CharFilter +from django_filters.filters import RangeFilter +from django.forms import DateField, Field +from datetime import datetime, time +from django.db.models import Sum +import graphene +import json class DateRangeField(Field): @@ -320,12 +319,19 @@ class Meta: class AssetImportFactsNode(DjangoObjectType): + total_fob_value = graphene.String() + class Meta: model = AssetImportFacts filter_fields = ['commercialized_between', 'date', 'registries', 'net_kilogram', 'fob_value'] interfaces = (graphene.Node, ) + def resolve_total_fob_value(self, info): + a = AssetImportFacts.objects.filter( + date=self.date).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + class AssetExportFactsNode(DjangoObjectType): class Meta: @@ -494,10 +500,228 @@ class Meta: interfaces = {graphene.Node, } +class Aggregated_Import(DjangoObjectType): + total_fob_value_country = graphene.String() + total_fob_value_transportation = graphene.String() + total_fob_value_date = graphene.String() + total_fob_value_urf = graphene.String() + total_fob_value_trade_bloc = graphene.String() + total_registries_country = graphene.String() + total_registries_transportation = graphene.String() + total_registries_date = graphene.String() + total_registries_urf = graphene.String() + total_registries_trade_bloc = graphene.String() + total_net_kilogram_country = graphene.String() + total_net_kilogram_transportation = graphene.String() + total_net_kilogram_date = graphene.String() + total_net_kilogram_urf = graphene.String() + total_net_kilogram_trade_bloc = graphene.String() + + class Meta: + model = AssetImportFacts + filter_fields = ['date', 'registries', 'net_kilogram', 'fob_value'] + interfaces = (graphene.Node, ) + + def resolve_total_fob_value_date(self, info): + a = 
AssetImportFacts.objects.filter(date=self.date).aggregate( + Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_country(self, info): + a = AssetImportFacts.objects.filter( + origin_country__country_name_pt=self.origin_country. + country_name_pt).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_transportation(self, info): + a = AssetImportFacts.objects.filter( + transportation__transportation_name=self.transportation. + transportation_name).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_urf(self, info): + a = AssetImportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_trade_bloc(self, info): + a = AssetImportFacts.objects.filter( + origin_country__trade_bloc__bloc_name_pt=self.origin_country. + trade_bloc.bloc_name_pt).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_registries_date(self, info): + a = AssetImportFacts.objects.filter(date=self.date).aggregate( + Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_country(self, info): + a = AssetImportFacts.objects.filter( + origin_country__country_name_pt=self.origin_country. + country_name_pt).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_transportation(self, info): + a = AssetImportFacts.objects.filter( + transportation__transportation_name=self.transportation. + transportation_name).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_urf(self, info): + a = AssetImportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_trade_bloc(self, info): + a = AssetImportFacts.objects.filter( + origin_country__trade_bloc__bloc_name_pt=self.origin_country. 
+ trade_bloc.bloc_name_pt).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_net_kilogram_date(self, info): + a = AssetImportFacts.objects.filter(date=self.date).aggregate( + Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_country(self, info): + a = AssetImportFacts.objects.filter( + origin_country__country_name_pt=self.origin_country. + country_name_pt).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_transportation(self, info): + a = AssetImportFacts.objects.filter( + transportation__transportation_name=self.transportation. + transportation_name).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_urf(self, info): + a = AssetImportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_trade_bloc(self, info): + a = AssetImportFacts.objects.filter( + origin_country__trade_bloc__bloc_name_pt=self.origin_country. 
+ trade_bloc.bloc_name_pt).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + +class Aggregated_Export(DjangoObjectType): + total_fob_value_country = graphene.String() + total_fob_value_transportation = graphene.String() + total_fob_value_date = graphene.String() + total_fob_value_urf = graphene.String() + total_fob_value_trade_bloc = graphene.String() + total_registries_country = graphene.String() + total_registries_transportation = graphene.String() + total_registries_date = graphene.String() + total_registries_urf = graphene.String() + total_registries_trade_bloc = graphene.String() + total_net_kilogram_country = graphene.String() + total_net_kilogram_transportation = graphene.String() + total_net_kilogram_date = graphene.String() + total_net_kilogram_urf = graphene.String() + total_net_kilogram_trade_bloc = graphene.String() + + class Meta: + model = AssetExportFacts + filter_fields = ['date', 'registries', 'net_kilogram', 'fob_value'] + interfaces = (graphene.Node, ) + + def resolve_total_fob_value_date(self, info): + a = AssetExportFacts.objects.filter(date=self.date).aggregate( + Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_country(self, info): + a = AssetExportFacts.objects.filter( + destination_country__country_name_pt=self.destination_country. + country_name_pt).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_transportation(self, info): + a = AssetExportFacts.objects.filter( + transportation__transportation_name=self.transportation. + transportation_name).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_urf(self, info): + a = AssetExportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_fob_value_trade_bloc(self, info): + a = AssetExportFacts.objects.filter( + destination_country__trade_bloc__bloc_name_pt=self. 
+ destination_country.trade_bloc.bloc_name_pt).aggregate( + Sum('fob_value')) + return a['fob_value__sum'] + + def resolve_total_registries_date(self, info): + a = AssetExportFacts.objects.filter(date=self.date).aggregate( + Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_country(self, info): + a = AssetExportFacts.objects.filter( + destination_country__country_name_pt=self.destination_country. + country_name_pt).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_transportation(self, info): + a = AssetExportFacts.objects.filter( + transportation__transportation_name=self.transportation. + transportation_name).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_urf(self, info): + a = AssetExportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('registries')) + return a['registries__sum'] + + def resolve_total_registries_trade_bloc(self, info): + a = AssetExportFacts.objects.filter( + destination_country__trade_bloc__bloc_name_pt=self. + destination_country.trade_bloc.bloc_name_pt).aggregate( + Sum('registries')) + return a['registries__sum'] + + def resolve_total_net_kilogram_date(self, info): + a = AssetExportFacts.objects.filter(date=self.date).aggregate( + Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_country(self, info): + a = AssetExportFacts.objects.filter( + destination_country__country_name_pt=self.destination_country. + country_name_pt).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_transportation(self, info): + a = AssetExportFacts.objects.filter( + transportation__transportation_name=self.transportation. 
+ transportation_name).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_urf(self, info): + a = AssetExportFacts.objects.filter( + urf__urf_name=self.urf.urf_name).aggregate(Sum('net_kilogram')) + return a['net_kilogram__sum'] + + def resolve_total_net_kilogram_trade_bloc(self, info): + a = AssetExportFacts.objects.filter( + destination_country__trade_bloc__bloc_name_pt=self. + destination_country.trade_bloc.bloc_name_pt).aggregate( + Sum('net_kilogram')) + return a['net_kilogram__sum'] + + class Query(graphene.ObjectType): all_import = DjangoFilterConnectionField( AssetImportFactsNode, filterset_class=AssetImportFilter, description="Método que retorna os objetos do tipo importação") all_export = DjangoFilterConnectionField( + AssetExportFactsNode, filterset_class=AssetExportFilter, description="Método que retorna os objetos do tipo exportação") all_tradeBlocs = DjangoFilterConnectionField(TradeBlocsType, description="Método que retorna os registros de blocos econômicos") all_country = DjangoFilterConnectionField(CountryType, description="Método que retorna os países registrados") @@ -508,6 +732,27 @@ class Query(graphene.ObjectType): all_cuci = DjangoFilterConnectionField(CUCIType, description="Método que retorna as nomenclaturas CUCI registradas") all_cgce = DjangoFilterConnectionField(CGCEType, description="Método que retorna as nomenclaturas CGCE registradas") all_sh = DjangoFilterConnectionField(SHType, description="Método que retorna as nomenclaturas SH registradas") + aggregated_import_transportation = DjangoFilterConnectionField( + Aggregated_Import, filterset_class=AssetImportFilter) + aggregated_import_urf = DjangoFilterConnectionField( + Aggregated_Import, filterset_class=AssetImportFilter) + aggregated_import_date = DjangoFilterConnectionField( + Aggregated_Import, filterset_class=AssetImportFilter) + aggregated_import_country = DjangoFilterConnectionField( + Aggregated_Import, 
filterset_class=AssetImportFilter) + aggregated_import_trade_bloc = DjangoFilterConnectionField( + Aggregated_Import, filterset_class=AssetImportFilter) + aggregated_export_transportation = DjangoFilterConnectionField( + Aggregated_Export, filterset_class=AssetExportFilter) + aggregated_export_urf = DjangoFilterConnectionField( + Aggregated_Export, filterset_class=AssetExportFilter) + aggregated_export_date = DjangoFilterConnectionField( + Aggregated_Export, filterset_class=AssetExportFilter) + aggregated_export_country = DjangoFilterConnectionField( + Aggregated_Export, filterset_class=AssetExportFilter) + aggregated_export_trade_bloc = DjangoFilterConnectionField( + Aggregated_Export, filterset_class=AssetExportFilter) + def resolve_all_import(self, info, **kwargs): return AssetImportFacts.objects.all() @@ -541,3 +786,77 @@ def resolve_all_cgce(self, info, **kwargs): def resolve_all_sh(self, info, **kwargs): return SH.objects.all() + + def resolve_aggregated_import_transportation(self, info, **kwargs): + return list(AssetImportFacts.objects.raw( + '''SELECT b.[id], a.[transportation_code], a.[transportation_name] + FROM assets_Transportation a INNER JOIN assets_AssetImportFacts b + ON a.[transportation_code]=b.[transportation_id] + GROUP BY a.[transportation_name]''')) + + def resolve_aggregated_import_urf(self, info, **kwargs): + return list(AssetImportFacts.objects.raw( + '''SELECT b.[id], a.[urf_code], a.[urf_name] + FROM assets_Urf a INNER JOIN assets_AssetImportFacts b + ON a.[urf_code]=b.[urf_id] + GROUP BY a.[urf_name]''')) + + def resolve_aggregated_import_date(self, info, **kwargs): + return list(AssetImportFacts.objects.raw('''Select id, COUNT(date) + FROM assets_AssetImportFacts + GROUP BY date''')) + + def resolve_aggregated_import_country(self, info, **kwargs): + return list(AssetImportFacts.objects.raw( + '''SELECT b.[id], a.[id], a.[country_name_pt] + FROM assets_Country a INNER JOIN assets_AssetImportFacts b + ON a.[id]=b.[origin_country_id] + 
GROUP BY a.[country_name_pt]''')) + + def resolve_aggregated_import_trade_bloc(self, info, **kwargs): + return list(AssetImportFacts.objects.raw( + '''SELECT c.[bloc_code], c.[bloc_name_pt], b.[origin_country_id], + a.[id], a.[trade_bloc_id] + FROM assets_AssetImportFacts b + INNER JOIN assets_Country a + ON a.[id]=b.[origin_country_id] + INNER JOIN assets_TradeBlocs c + ON c.[bloc_code]=a.[trade_bloc_id] + GROUP BY c.[bloc_name_pt]''')) + + def resolve_aggregated_export_transportation(self, info, **kwargs): + return list(AssetExportFacts.objects.raw( + '''SELECT b.[id], a.[transportation_code], a.[transportation_name] + FROM assets_Transportation a INNER JOIN assets_AssetExportFacts b + ON a.[transportation_code]=b.[transportation_id] + GROUP BY a.[transportation_name]''')) + + def resolve_aggregated_export_urf(self, info, **kwargs): + return list(AssetExportFacts.objects.raw( + '''SELECT b.[id], a.[urf_code], a.[urf_name] + FROM assets_Urf a INNER JOIN assets_AssetExportFacts b + ON a.[urf_code]=b.[urf_id] + GROUP BY a.[urf_name]''')) + + def resolve_aggregated_export_date(self, info, **kwargs): + return list(AssetExportFacts.objects.raw('''Select id, COUNT(date) + FROM assets_AssetExportFacts + GROUP BY date''')) + + def resolve_aggregated_export_country(self, info, **kwargs): + return list(AssetExportFacts.objects.raw( + '''SELECT b.[id], a.[id], a.[country_name_pt] + FROM assets_Country a INNER JOIN assets_AssetExportFacts b + ON a.[id]=b.[destination_country_id] + GROUP BY a.[country_name_pt]''')) + + def resolve_aggregated_export_trade_bloc(self, info, **kwargs): + return list(AssetExportFacts.objects.raw( + '''SELECT c.[bloc_code], c.[bloc_name_pt], b.[destination_country_id], + a.[id], a.[trade_bloc_id] + FROM assets_AssetExportFacts b + INNER JOIN assets_Country a + ON a.[id]=b.[destination_country_id] + INNER JOIN assets_TradeBlocs c + ON c.[bloc_code]=a.[trade_bloc_id] + GROUP BY c.[bloc_name_pt]''')) diff --git a/src/comex_stat/assets/tests.py 
b/src/comex_stat/assets/tests.py index c6abea4..c2a5460 100644 --- a/src/comex_stat/assets/tests.py +++ b/src/comex_stat/assets/tests.py @@ -6,6 +6,7 @@ TradeBlocs, Country, FederativeUnit, Transportation, Urf, NCM, CUCI, CGCE, SH) from django.core.exceptions import ValidationError +from collections import OrderedDict class SHTests(TestCase): @@ -1644,4 +1645,814 @@ def test_query_range_date_export(self): } schema = graphene.Schema(Query) result = schema.execute(query) - self.assertEqual(expected, result.data) \ No newline at end of file + self.assertEqual(expected, result.data) + + def test_import_aggregated_fob_value_country(self): + ''' + This test verifies the database search result + for the aggregated searches + ''' + + query = ''' + { + aggregatedImportCountry{ + edges{ + node{ + originCountry{ + countryNamePt + } + totalFobValueCountry + } + } + } + } + ''' + + expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueCountry', '191.0')]))])])]))]) + + schema = graphene.Schema(Query) + result = schema.execute(query) + self.assertEqual(expected, result.data) + + def test_import_aggregated_fob_value_date(self): + ''' + This test verifies the database search result + for the aggregated searches + ''' + + query = ''' + { + aggregatedImportCountry{ + edges{ + node{ + originCountry{ + countryNamePt + } + totalFobValueDate + } + } + } + } + ''' + + expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueDate', '191.0')]))])])]))]) + + schema = graphene.Schema(Query) + result = schema.execute(query) + self.assertEqual(expected, result.data) + + def test_import_aggregated_fob_value_transportation(self): + ''' + This test verifies the database search result + for the aggregated searches + ''' + + query = ''' + { 
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalFobValueTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueTransportation', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_fob_value_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalFobValueUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueUrf', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_fob_value_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalFobValueTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueTradeBloc', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_registries_country(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesCountry
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesCountry', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_registries_date(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesDate
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesDate', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_registries_transportation(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesTransportation', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_registries_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesUrf', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_registries_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesTradeBloc', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_net_kilogram_date(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramDate
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramDate', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_net_kilogram_country(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramCountry
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramCountry', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_net_kilogram_transportation(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramTransportation', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_net_kilogram_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramUrf', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_import_aggregated_net_kilogram_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedImportCountry{
+                edges{
+                    node{
+                        originCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedImportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('originCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramTradeBloc', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_fob_value_country(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalFobValueCountry
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueCountry', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_fob_value_date(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalFobValueDate
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueDate', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_fob_value_transportation(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalFobValueTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueTransportation', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_fob_value_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalFobValueUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueUrf', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_fob_value_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalFobValueTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalFobValueTradeBloc', '191.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_registries_country(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesCountry
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesCountry', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_registries_date(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesDate
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesDate', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_registries_transportation(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesTransportation', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_registries_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesUrf', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_registries_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalRegistriesTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalRegistriesTradeBloc', '1')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_net_kilogram_date(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramDate
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramDate', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_net_kilogram_country(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramCountry
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramCountry', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_net_kilogram_transportation(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramTransportation
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramTransportation', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_net_kilogram_urf(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramUrf
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramUrf', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
+
+    def test_export_aggregated_net_kilogram_trade_bloc(self):
+        '''
+        This test verifies the database search result
+        for the aggregated searches
+        '''
+
+        query = '''
+        {
+            aggregatedExportCountry{
+                edges{
+                    node{
+                        destinationCountry{
+                            countryNamePt
+                        }
+                        totalNetKilogramTradeBloc
+                    }
+                }
+            }
+        }
+        '''
+
+        expected = OrderedDict([('aggregatedExportCountry', OrderedDict([('edges', [OrderedDict([('node', OrderedDict([('destinationCountry', OrderedDict([('countryNamePt', 'Nome')])), ('totalNetKilogramTradeBloc', '1.0')]))])])]))])
+
+        schema = graphene.Schema(Query)
+        result = schema.execute(query)
+        self.assertEqual(expected, result.data)
\ No newline at end of file