build: reformat all files
davinov committed Oct 6, 2020
1 parent 8db6147 commit 5daa436
Showing 30 changed files with 47 additions and 50 deletions.
2 changes: 1 addition & 1 deletion .coveragerc
@@ -8,4 +8,4 @@ exclude_lines =
pragma: no cover

# Don't cover NotImplemented methods (specially useful for abstract classes)
raise NotImplementedError
raise NotImplementedError
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -51,7 +51,7 @@ jobs:

- name: Install psql-odbc dependencies
run: sudo bash toucan_connectors/install_scripts/psql.sh

- name: install
run: make install

16 changes: 8 additions & 8 deletions README.md
@@ -33,22 +33,22 @@ or [MacOS](https://docs.microsoft.com/en-us/sql/connect/odbc/linux-mac/install-m
You can then install the library with `env LDFLAGS='-L/usr/local/lib -L/usr/local/opt/openssl/lib -L/usr/local/opt/readline/lib' pip install psycopg2`

## Testing a connector
If you want to run the tests for another connector, you can install the extra dependencies
(e.g. to test MySQL just type `pip install -e ".[mysql]"`)
If you want to run the tests for another connector, you can install the extra dependencies
(e.g. to test MySQL just type `pip install -e ".[mysql]"`)
Now `pytest tests/mysql` should run all the mysql tests properly.

If you want to run the tests for all the connectors you can add all the dependencies by typing
If you want to run the tests for all the connectors you can add all the dependencies by typing
`pip install -e ".[all]"` and `make test`.

## Adding a connector

To generate the connector and test modules from boilerplate, run:
To generate the connector and test modules from boilerplate, run:

```
$ make new_connector type=mytype
```

`mytype` should be the name of a system we would like to build a connector for,
`mytype` should be the name of a system we would like to build a connector for,
such as `MySQL` or `Hive` or `Magento`.

#### Step 1 : Tests
@@ -81,7 +81,7 @@ from toucan_connectors.toucan_connector import ToucanConnector, ToucanDataSource
class MyTypeDataSource(ToucanDataSource):
"""Model of my datasource"""
query: str


class MyTypeConnector(ToucanConnector):
"""Model of my connector"""
@@ -90,13 +90,13 @@ class MyTypeConnector(ToucanConnector):
host: str
port: int
database: str

def _retrieve_data(self, data_source: MyTypeDataSource) -> pd.DataFrame:
"""how to retrieve a dataframe"""
```
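As an aside (not part of this diff), here is a minimal, hedged sketch of what a filled-in `_retrieve_data` could look like. The `fetch_rows` helper is hypothetical and stands in for whatever client library the target system provides, and `pydantic.BaseModel` replaces the Toucan base classes only so the snippet runs standalone:

```python
import pandas as pd
from pydantic import BaseModel


def fetch_rows(host: str, port: int, database: str, query: str) -> list:
    """Hypothetical client call: a real connector would query the target system here."""
    return [{"id": 1, "value": 42}, {"id": 2, "value": 17}]


class MyTypeDataSource(BaseModel):
    query: str


class MyTypeConnector(BaseModel):
    host: str
    port: int
    database: str

    def _retrieve_data(self, data_source: MyTypeDataSource) -> pd.DataFrame:
        # Wrap whatever the client returns in the DataFrame the platform expects.
        rows = fetch_rows(self.host, self.port, self.database, data_source.query)
        return pd.DataFrame(rows)


# Usage sketch:
connector = MyTypeConnector(host="localhost", port=1234, database="demo")
print(connector._retrieve_data(MyTypeDataSource(query="SELECT 1")))
```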

Please add your connector in `toucan_connectors/__init__.py`.
The key is what we call the `type` of the connector, which
The key is what we call the `type` of the connector, which
is basically like an id used to retrieve it.
```python
CONNECTORS_CATALOGUE = {
    ...
}
```
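For illustration only (the real catalogue entries are elided above and the names below are hypothetical), an entry simply maps that `type` string to the connector class so it can be looked up later:

```python
# Hypothetical sketch, not the repository's actual catalogue contents.
class MyTypeConnector:  # stand-in for the connector class sketched earlier
    pass


CONNECTORS_CATALOGUE = {
    'MyType': MyTypeConnector,  # the key is the connector "type" / identifier
}

assert CONNECTORS_CATALOGUE['MyType'] is MyTypeConnector
```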
1 change: 0 additions & 1 deletion doc/connectors/ROK.md
@@ -42,4 +42,3 @@ DATA_SOURCES: [
...
]
```

2 changes: 1 addition & 1 deletion doc/connectors/azure_mssql.md
@@ -41,4 +41,4 @@ DATA_SOURCES: [
,
...
]
```
```
2 changes: 1 addition & 1 deletion doc/connectors/dataiku.md
@@ -38,4 +38,4 @@ DATA_SOURCES: [
,
...
]
```
```
2 changes: 1 addition & 1 deletion doc/connectors/elasticsearch.md
@@ -54,4 +54,4 @@ DATA_SOURCES: [
Data will correspond to the field `_source` of the API response.
See API documentation for :
* _search : https://www.elastic.co/guide/en/elasticsearch/reference/current/search-search.html
* _msearch : https://www.elastic.co/guide/en/elasticsearch/reference/current/search-multi-search.html
* _msearch : https://www.elastic.co/guide/en/elasticsearch/reference/current/search-multi-search.html
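Purely as an illustration (not part of this diff), a small sketch of what it means for the data to correspond to the `_source` field, assuming the usual `hits.hits[]._source` layout of a `_search` response:

```python
# Illustrative response shape; real responses carry many more fields.
response = {
    "hits": {
        "hits": [
            {"_id": "1", "_source": {"language": "French", "value": 42}},
            {"_id": "2", "_source": {"language": "German", "value": 17}},
        ]
    }
}

# The records a data source would end up with:
rows = [hit["_source"] for hit in response["hits"]["hits"]]
print(rows)
```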
2 changes: 1 addition & 1 deletion doc/connectors/google_cloud_mysql.md
@@ -45,4 +45,4 @@ DATA_SOURCES: [
,
...
]
```
```
4 changes: 2 additions & 2 deletions doc/connectors/google_my_business.md
@@ -60,8 +60,8 @@ DATA_SOURCES: [

## Get credentials

First, you will need a valid `client_secret.json` file. You can download it from your Google Cloud Platform Console
in `API & Services` > `Credentials` > `OAuth 2.0 client IDs.`
First, you will need a valid `client_secret.json` file. You can download it from your Google Cloud Platform Console
in `API & Services` > `Credentials` > `OAuth 2.0 client IDs.`

Then, in a virtualenv with the `google_auth_oauthlib` and `google-api-python-client` packages, you can use this Python code to get your credentials:

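The code sample in the file is truncated here. Purely as a hedged sketch (the scope and flow below are assumptions, not necessarily the snippet the documentation shows), obtaining credentials interactively with `google_auth_oauthlib` could look like:

```python
from google_auth_oauthlib.flow import InstalledAppFlow

# Assumed scope for the My Business API; adjust to whatever the docs require.
SCOPES = ["https://www.googleapis.com/auth/business.manage"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)
credentials = flow.run_local_server(port=0)  # opens a browser for user consent

print(credentials.token, credentials.refresh_token)
```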
6 changes: 3 additions & 3 deletions doc/connectors/google_spreadsheet.md
@@ -5,8 +5,8 @@
Unless the spreadsheet is public, you will have to manually share it.

Open the Google spreadsheet inside your web browser. Inside the File menu, there is a
Share option. Click on it and enter the email address of your service account.
Share option. Click on it and enter the email address of your service account.

If you are on Toucan Toco's cloud, it is:
```
[email protected]
@@ -68,7 +68,7 @@ DATA_PROVIDERS: [
## Data source configuration

* `domain`: str, required
* `name`: str, required. Should match the data provider name
* `name`: str, required. Should match the data provider name
* `spreadsheet_id`: str, required. Id of the spreadsheet which can be found inside
the url: https://docs.google.com/spreadsheets/d/<spreadsheet_id_is_here>/edit?pref=2&pli=1#gid=0,
* `sheetname`: str. By default, the extractor returns the first sheet.
2 changes: 1 addition & 1 deletion doc/connectors/hive.md
@@ -45,4 +45,4 @@ DATA_SOURCES: [
,
...
]
```
```
14 changes: 7 additions & 7 deletions doc/connectors/http_api.md
@@ -2,21 +2,21 @@

This is a generic connector to get data from any HTTP APIs (REST style APIs).

This type of data source combines the features of Python’s [requests](http://docs.python-requests.org/)
This type of data source combines the features of Python’s [requests](http://docs.python-requests.org/)
library to get data from any API with the filtering language [jq](https://stedolan.github.io/jq/) for
flexible transformations of the responses.

Please see our [complete tutorial](https://docs.toucantoco.com/concepteur/tutorials/18-jq.html) for
Please see our [complete tutorial](https://docs.toucantoco.com/concepteur/tutorials/18-jq.html) for
an example of advanced use of this connector.

## Data provider configuration

* `type`: `"HttpAPI"`
* `name`: str, required
* `baseroute`: str, required
* `auth`: `{type: "basic|digest|oauth1|oauth2_backend|custom_token_server", args: [...], kwargs: {...}}`
cf. [requests auth](http://docs.python-requests.org/en/master/) and
[requests oauthlib](https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow) doc.
* `auth`: `{type: "basic|digest|oauth1|oauth2_backend|custom_token_server", args: [...], kwargs: {...}}`
cf. [requests auth](http://docs.python-requests.org/en/master/) and
[requests oauthlib](https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow) doc.
* `template`: dict. See below.

```coffee
@@ -35,8 +35,8 @@ DATA_PROVIDERS: [

### Template

You can use this object to avoid repetition in data sources.
The values of the three attributes will be used or overridden by
You can use this object to avoid repetition in data sources.
The values of the three attributes will be used or overridden by
all data sources using this provider.

* `json`: dict
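As a hedged sketch of the idea (assumed semantics, not the connector's actual code), the template supplies provider-level defaults that each data source can complete or override, assuming the three attributes are `json`, `headers` and `params`:

```python
# Provider-level template: shared defaults for every data source of this provider.
template = {
    "headers": {"Authorization": "Bearer <token>"},
    "params": {"format": "json"},
}

# A single data source only states what differs from the template.
data_source = {"params": {"format": "csv", "page": 2}}

# One plausible merge: data-source values win over template values, attribute by attribute.
merged = {
    key: {**template.get(key, {}), **data_source.get(key, {})}
    for key in ("json", "headers", "params")
}

print(merged["headers"])  # the template's Authorization header is kept
print(merged["params"])   # {'format': 'csv', 'page': 2} -> the data source overrides
```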
2 changes: 1 addition & 1 deletion doc/connectors/mssql.md
@@ -43,4 +43,4 @@ DATA_SOURCES: [
,
...
]
```
```
6 changes: 3 additions & 3 deletions doc/connectors/odata.md
@@ -5,9 +5,9 @@
* `type`: `"OData"`
* `name`: str, required
* `baseroute`: str, required
* `auth`: `{type: "basic|digest|oauth1|oauth2_backend|custom_token_server", args: [...], kwargs: {...}}`
cf. [requests auth](http://docs.python-requests.org/en/master/) and
[requests oauthlib](https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow) doc.
* `auth`: `{type: "basic|digest|oauth1|oauth2_backend|custom_token_server", args: [...], kwargs: {...}}`
cf. [requests auth](http://docs.python-requests.org/en/master/) and
[requests oauthlib](https://requests-oauthlib.readthedocs.io/en/latest/oauth2_workflow) doc.

```coffee
DATA_PROVIDERS: [
2 changes: 1 addition & 1 deletion doc/connectors/oracle_sql.md
@@ -7,7 +7,7 @@ Alternatively, you can refer to the Oracle website [installation instructions](h

* `type`: `"OracleSQL"`
* `name`: str, required
* `dsn`: str following the [DSN pattern](https://en.wikipedia.org/wiki/Data_source_name), required. The `host`, `port` and `service name` part of the dsn are required. For example: `localhost:80/service`
* `dsn`: str following the [DSN pattern](https://en.wikipedia.org/wiki/Data_source_name), required. The `host`, `port` and `service name` part of the dsn are required. For example: `localhost:80/service`
* `user`: str
* `password`: str
* `encoding`: str
2 changes: 1 addition & 1 deletion doc/connectors/revinate.md
@@ -51,4 +51,4 @@ DATA_SOURCES: [
params: '<params>',
filter: '<filter>'
]
```
```
2 changes: 1 addition & 1 deletion doc/connectors/sap_hana.md
@@ -39,4 +39,4 @@ DATA_SOURCES: [
,
...
]
```
```
2 changes: 1 addition & 1 deletion doc/connectors/snowflake.md
@@ -43,4 +43,4 @@ DATA_SOURCES: [
,
...
]
```
```
2 changes: 1 addition & 1 deletion doc/connectors/wootric.md
@@ -45,4 +45,4 @@ DATA_SOURCES: [
]
```

For more information, check [the api documentation](https://docs.wootric.com/api)
For more information, check [the api documentation](https://docs.wootric.com/api)
6 changes: 3 additions & 3 deletions doc/generate.py
@@ -1,7 +1,7 @@
# Script to generate a connector documentation.
import collections
import sys
import os
import sys
from contextlib import suppress

import toucan_connectors
@@ -89,7 +89,7 @@ def generate(klass):
li.append(',\n ...\n]\n```')
doc.append('\n'.join(li))

return '\n\n'.join([l for l in doc if l is not None])
return '\n\n'.join([line for line in doc if line is not None])


def get_connectors():
@@ -115,7 +115,7 @@ def generate_summmary(connectors):
connectors = collections.OrderedDict(sorted(connectors.items()))
for key, value in connectors.items():
doc.append(f'* [{key}](connectors/{value}.md)')
doc = '\n\n'.join([l for l in doc if l is not None])
doc = '\n\n'.join([line for line in doc if line is not None])
file_name = 'doc/connectors.md'
with open(file_name, 'w') as file:
file.write(doc)
2 changes: 1 addition & 1 deletion templates/cap.m4
@@ -1,3 +1,3 @@
define(`upcase', `translit(`$*', `a-z', `A-Z')')dnl
define(`downcase', `translit(`$*', `A-Z', `a-z')')dnl
define(`cap', `regexp(`$1', `^\(\w\)\(\w*\)', `upcase(`\1')`'downcase(`\2')')')dnl
define(`cap', `regexp(`$1', `^\(\w\)\(\w*\)', `upcase(`\1')`'downcase(`\2')')')dnl
1 change: 0 additions & 1 deletion templates/tests.py.m4
@@ -6,4 +6,3 @@ from toucan_connectors.downcase(TYPE).downcase(TYPE)_connector import cap(TYPE)C

def test_get_df():
pass

2 changes: 1 addition & 1 deletion tests/google_analytics/fixtures/reports.json
@@ -76,4 +76,4 @@
}
}
]
}
}
2 changes: 1 addition & 1 deletion tests/micro_strategy/fixtures/fixture.json


2 changes: 1 addition & 1 deletion tests/mongo/fixtures/docs.json
@@ -17,4 +17,4 @@
"language": "German",
"value": 17
}
]
]
2 changes: 1 addition & 1 deletion tests/odata/fixtures/records.json
@@ -1 +1 @@
[{"CustomerID":"FOLIG","EmployeeID":8,"Freight":11.26,"OrderDate":"1997-01-08T00:00:00Z","OrderID":10408,"RequiredDate":"1997-02-05T00:00:00Z","ShipAddress":"184, chauss\u00e9e de Tournai","ShipCity":"Lille","ShipCountry":"France","ShipName":"Folies gourmandes","ShipPostalCode":"59000","ShipRegion":null,"ShipVia":1,"ShippedDate":"1997-01-14T00:00:00Z"},{"CustomerID":"VINET","EmployeeID":3,"Freight":11.08,"OrderDate":"1997-11-12T00:00:00Z","OrderID":10739,"RequiredDate":"1997-12-10T00:00:00Z","ShipAddress":"59 rue de l'Abbaye","ShipCity":"Reims","ShipCountry":"France","ShipName":"Vins et alcools Chevalier","ShipPostalCode":"51100","ShipRegion":null,"ShipVia":3,"ShippedDate":"1997-11-17T00:00:00Z"},{"CustomerID":"BONAP","EmployeeID":1,"Freight":11.06,"OrderDate":"1997-05-02T00:00:00Z","OrderID":10525,"RequiredDate":"1997-05-30T00:00:00Z","ShipAddress":"12, rue des Bouchers","ShipCity":"Marseille","ShipCountry":"France","ShipName":"Bon app'","ShipPostalCode":"13008","ShipRegion":null,"ShipVia":2,"ShippedDate":"1997-05-23T00:00:00Z"}]
[{"CustomerID":"FOLIG","EmployeeID":8,"Freight":11.26,"OrderDate":"1997-01-08T00:00:00Z","OrderID":10408,"RequiredDate":"1997-02-05T00:00:00Z","ShipAddress":"184, chauss\u00e9e de Tournai","ShipCity":"Lille","ShipCountry":"France","ShipName":"Folies gourmandes","ShipPostalCode":"59000","ShipRegion":null,"ShipVia":1,"ShippedDate":"1997-01-14T00:00:00Z"},{"CustomerID":"VINET","EmployeeID":3,"Freight":11.08,"OrderDate":"1997-11-12T00:00:00Z","OrderID":10739,"RequiredDate":"1997-12-10T00:00:00Z","ShipAddress":"59 rue de l'Abbaye","ShipCity":"Reims","ShipCountry":"France","ShipName":"Vins et alcools Chevalier","ShipPostalCode":"51100","ShipRegion":null,"ShipVia":3,"ShippedDate":"1997-11-17T00:00:00Z"},{"CustomerID":"BONAP","EmployeeID":1,"Freight":11.06,"OrderDate":"1997-05-02T00:00:00Z","OrderID":10525,"RequiredDate":"1997-05-30T00:00:00Z","ShipAddress":"12, rue des Bouchers","ShipCity":"Marseille","ShipCountry":"France","ShipName":"Bon app'","ShipPostalCode":"13008","ShipRegion":null,"ShipVia":2,"ShippedDate":"1997-05-23T00:00:00Z"}]
1 change: 0 additions & 1 deletion tests/postgres/fixtures/world_postgres.sql
@@ -5385,4 +5385,3 @@ COMMIT;
ANALYZE city;
ANALYZE country;
ANALYZE countrylanguage;

2 changes: 1 addition & 1 deletion tests/trello/fixtures/fixture.json
@@ -283,4 +283,4 @@
"name": "titi"
}
]
}
}
2 changes: 1 addition & 1 deletion toucan_connectors/aircall/Aircall.svg
2 changes: 1 addition & 1 deletion toucan_connectors/install_scripts/psql.sh
@@ -26,4 +26,4 @@ fi



touch ~/odbcdriver-installed
touch ~/odbcdriver-installed
