Time Series QA: Make notebooks self-contained, also adding DDL and DML
Otherwise, people or QA jobs invoking individual notebooks, or running them in a
different order, have a hard time.
amotl committed Mar 19, 2024
1 parent c7c0fbe commit 3851f9b
Showing 5 changed files with 58 additions and 7 deletions.
27 changes: 26 additions & 1 deletion topic/timeseries/exploratory_data_analysis.ipynb
@@ -102,12 +102,37 @@
"engine = sa.create_engine(CONNECTION_STRING, echo=os.environ.get('DEBUG'))"
]
},
{
"cell_type": "markdown",
"source": [
"First, import data into CrateDB. This is a shorthand notation for the same code\n",
"illustrated in `timeseries-queries-and-visualization.ipynb`, running corresponding\n",
"SQL DDL and DML statements, to load the data."
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"from cratedb_toolkit.datasets import load_dataset\n",
"\n",
"dataset = load_dataset(\"tutorial/weather-basic\")\n",
"dataset.dbtable(dburi=CONNECTION_STRING, table=\"weather_data\").load()"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "markdown",
"id": "cdae15fa",
"metadata": {},
"source": [
"The next step fetches data from CrateDB and load it into a pandas data frame:"
"Then, load data from CrateDB into a pandas data frame:"
]
},
{
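The new markdown cell refers to the SQL DDL and DML statements that the `load_dataset` shorthand wraps. The authoritative statements live in `timeseries-queries-and-visualization.ipynb`; as a rough orientation, they could be issued from Python like in the following sketch, where the connection string, column names, types, and sample row are illustrative assumptions.

```python
import sqlalchemy as sa

# Placeholder URI; the notebooks define CONNECTION_STRING earlier on.
engine = sa.create_engine("crate://localhost:4200")

# Assumed DDL for the target table; the real schema is defined in
# `timeseries-queries-and-visualization.ipynb`.
ddl = """
CREATE TABLE IF NOT EXISTS weather_data (
    "timestamp"   TIMESTAMP,
    "location"    TEXT,
    "temperature" DOUBLE PRECISION,
    "humidity"    DOUBLE PRECISION,
    "wind_speed"  DOUBLE PRECISION
)
"""

# Assumed DML, inserting a single made-up reading.
dml = sa.text("""
    INSERT INTO weather_data ("timestamp", "location", "temperature", "humidity", "wind_speed")
    VALUES (:ts, :loc, :temp, :hum, :wind)
""")

with engine.connect() as conn:
    conn.execute(sa.text(ddl))
    conn.execute(dml, {"ts": "2024-03-19T12:00:00", "loc": "Vienna", "temp": 12.3, "hum": 58.0, "wind": 7.5})
```

The `load_dataset("tutorial/weather-basic")` cell performs the equivalent import in a single call, which is what makes the notebook runnable on its own.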
4 changes: 2 additions & 2 deletions topic/timeseries/requirements-dev.txt
@@ -1,5 +1,5 @@
# Real.
# pueblo[notebook,testing]>=0.0.7
pueblo[notebook,testing]>=0.0.9

# Development.
pueblo[notebook,testing] @ git+https://github.com/pyveci/pueblo.git@amo/testbook
# pueblo[notebook,testing] @ git+https://github.com/pyveci/pueblo.git@amo/testbook
1 change: 1 addition & 0 deletions topic/timeseries/requirements.txt
@@ -1,4 +1,5 @@
crate[sqlalchemy]==0.34.0
cratedb-toolkit[datasets]==0.0.7
refinitiv-data<1.7
pandas<2
pycaret>=3.0,<3.4
27 changes: 26 additions & 1 deletion topic/timeseries/time-series-decomposition.ipynb
@@ -106,12 +106,37 @@
"engine = sa.create_engine(CONNECTION_STRING, echo=os.environ.get('DEBUG'))"
]
},
{
"cell_type": "markdown",
"source": [
"First, import data into CrateDB. This is a shorthand notation for the same code\n",
"illustrated in `timeseries-queries-and-visualization.ipynb`, running corresponding\n",
"SQL DDL and DML statements, to load the data."
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "code",
"execution_count": null,
"outputs": [],
"source": [
"from cratedb_toolkit.datasets import load_dataset\n",
"\n",
"dataset = load_dataset(\"tutorial/weather-basic\")\n",
"dataset.dbtable(dburi=CONNECTION_STRING, table=\"weather_data\").load()"
],
"metadata": {
"collapsed": false
}
},
{
"cell_type": "markdown",
"id": "cdae15fa",
"metadata": {},
"source": [
"The next step fetches data from CrateDB and load it into a pandas data frame:"
"Then, load data from CrateDB into a pandas data frame:"
]
},
{
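Both notebooks then continue with "Then, load data from CrateDB into a pandas data frame". A minimal sketch of that step, assuming the `weather_data` table from above and again a placeholder connection string:

```python
import pandas as pd
import sqlalchemy as sa

# Placeholder URI; the notebooks define CONNECTION_STRING earlier on.
engine = sa.create_engine("crate://localhost:4200")

# Read the freshly loaded table into a pandas data frame.
df = pd.read_sql("SELECT * FROM weather_data LIMIT 50", engine)
print(df.head())
```

`pd.read_sql()` accepts a plain SQL string together with an SQLAlchemy engine, so no cursor handling is needed.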
6 changes: 3 additions & 3 deletions topic/timeseries/timeseries-queries-and-visualization.ipynb
@@ -200,9 +200,9 @@
"id": "226e67f8",
"metadata": {},
"source": [
"After inserting data, it is recommended to `ANALYZE` the tables to make the query optimizer obtain\n",
"important statistics information about them. Let's also invoke a `REFRESH` statement beforehand,\n",
"to make sure that the data is up-to-date."
"After inserting data, let's invoke a `REFRESH` statement, to make sure it is\n",
"up-to-date. It is also recommended to `ANALYZE` the tables, to make the query\n",
"optimizer obtain important statistics information about them."
]
},
{
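The reworded cell explains the ordering: `REFRESH` first, so the freshly inserted rows are visible, then `ANALYZE`, so the optimizer has current statistics. Expressed as code, the two statements could be issued through the same SQLAlchemy engine, again with a placeholder connection string:

```python
import sqlalchemy as sa

# Placeholder URI; the notebooks define CONNECTION_STRING earlier on.
engine = sa.create_engine("crate://localhost:4200")

with engine.connect() as conn:
    # Make rows written since the last periodic refresh visible to queries.
    conn.execute(sa.text("REFRESH TABLE weather_data"))
    # Collect table statistics for the query optimizer.
    conn.execute(sa.text("ANALYZE"))
```

CrateDB surfaces new writes at the periodic table refresh; an explicit `REFRESH TABLE` forces immediate visibility, and `ANALYZE` gathers the statistics the query planner relies on.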
