This tutorial is meant to be a whirlwind introduction to the broad landscape of artificial intelligence. Although we often talk about AI as a monolith, it turns out it's a large collection of techniques built for solving specific classes of problems.
In this tutorial, we'll explore those techniques by looking at different off-the-shelf tools for their specific areas. The goal is to be an inch deep and a mile wide; we want to install and use the tools on a small set of problems to:
- get a feel for that process
- see the breadth of the AI landscape
- start to build a mapping from real world problems to AI techniques
The tools we're using range from "published library that is really just part of someone's dissertation" to "well-maintained open source library". This is intentional, because it's representative of reality.
- An environment management tool for `python`
  - the demo uses `venv`
- `pip` for managing python libraries
- `python`, the following libraries, and their supporting packages:
  - `matplotlib`
  - `sklearn`
  - `pytorch`
  - `torchvision`
  - `vaderSentiment`
  - `nltk`
    - You'll want to run `nltk.download('all')` before the day of the tutorial
  - `gymnasium`
    - You'll want to `pip install gymnasium[classic_control]` as well
  - `pathfinding`
-
- The following datasets:
  - NIST digit data (included with sklearn)
  - Better-resolution MNIST digit data (downloadable with torchvision):

        datasets.MNIST('data', download=True, transform=transform, train=True)
        datasets.MNIST('data', download=True, transform=transform, train=False)

  - IMDB Sentiment dataset
  - Pathfinding maps and scenarios
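The sklearn copy of the digit data ships with the library and needs no download, so it's an easy way to confirm that part of the setup works:

```python
from sklearn.datasets import load_digits

digits = load_digits()        # 8x8 grayscale digit images derived from NIST
print(digits.images.shape)    # 1797 images, each 8x8
print(digits.target[:5])      # the matching labels
```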
Currently, there are setup scripts available for any system that can run `apt`, which is to say many Linux distros. These were originally built to configure virtual machine images or remote boxes (e.g. on Digital Ocean) for use in the tutorial. They could also be useful to you in getting your own machine configured for the tutorial.
Downloads general development tools and installs various apt-gettable python libraries for use in the tutorials.
Sets up a virtual environment for the natural language processing tutorial and pip installs several python libraries used in the NLP tutorial, specifically:

- `vaderSentiment` - the VADER sentiment analysis tool
- `nltk` - the Natural Language Toolkit

Further, it downloads the IMDB sentiment dataset from Stanford and unpacks it into `./nlp/data/`.
Sets up a virtual environment for the planning tutorial and pip installs the python libraries used in the tutorial, specifically:

- `gymnasium` - formerly OpenAI Gym, a reinforcement learning framework
- `gymnasium[classic_control]` - the classic control example domains
- `tqdm` - a terminal-based progress bar gymnasium relies on
- `pathfinding` - a 2D pathfinding library
It also downloads a 2D pathfinding dataset (maps from the Baldur's Gate series) and unpacks them to `./planning/maps` and `./planning/scenarios`.
Presented at:

- Indy.Code() 2023
- CodeMash 2024

Presenters:

- Jordan Thayer
- Robert Herbig
- Will Trimble
- Lee Harold