This program identifies and removes C-style comments from an input file using a deterministic finite automaton (DFA), then uses a second DFA to convert the comment-free input into a series of tokens.
Authored by: Blake Marshall, Brandon Robinson, Holden Ea, Rolando Yax, and Jacob Sellers.
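
For illustration, the comment-removal pass can be thought of as a small DFA over the input characters. The sketch below is an assumption made for explanatory purposes (the state names, replacing each block comment with a single space, and ignoring string and character literals are all simplifications), not the project's actual code:

    /*
     * Minimal illustrative sketch of a DFA-based comment stripper.
     * State names and structure are assumptions for explanation only,
     * not this project's implementation. Reads stdin, writes stdout.
     * String and character literals are intentionally not handled here.
     */
    #include <stdio.h>

    enum state { CODE, SLASH, BLOCK, BLOCK_STAR, LINE };

    int main(void) {
        enum state s = CODE;
        int c;
        while ((c = getchar()) != EOF) {
            switch (s) {
            case CODE:
                if (c == '/') s = SLASH;       /* possible comment start */
                else putchar(c);
                break;
            case SLASH:
                if (c == '*') s = BLOCK;       /* block comment begins */
                else if (c == '/') s = LINE;   /* line comment begins */
                else { putchar('/'); putchar(c); s = CODE; }  /* plain slash */
                break;
            case BLOCK:
                if (c == '*') s = BLOCK_STAR;  /* possible end of block comment */
                break;
            case BLOCK_STAR:
                if (c == '/') { putchar(' '); s = CODE; }  /* comment becomes a space */
                else if (c != '*') s = BLOCK;
                break;
            case LINE:
                if (c == '\n') { putchar('\n'); s = CODE; }  /* line comment ends */
                break;
            }
        }
        if (s == SLASH) putchar('/');  /* emit a trailing lone slash at EOF */
        return 0;
    }

The tokenizer pass works the same way in principle: a second DFA walks the stripped input one character at a time and emits a token each time it reaches an accepting state.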
This project can be built and run via make:

    make
    ./tokenize.x
If you are on a Windows-based machine, you will need to adjust the Makefile to generate a .exe executable rather than a .x file.