- youtube.py contains the complete code to run the Streamlit application.
- Before running the Python code, make sure the schemas are built: create the required tables by running the Youtube Warehouse Schemas.SQL file in MySQL (a loading sketch is shown after this list).
- Turning to the Python code, you need an API key. Generate one using Google API Services.
- Enter your API key in the main block.
- Also keep your MySQL connection details ready, since there are places in the code where you need to enter your localhost credentials.
- We also use MongoDB, so make sure your local MongoDB instance is running; a connection sketch covering the API key, MySQL, and MongoDB follows this list.
- The code is organized into three classes.
- The first class, youtube_harvest(), is responsible for getting all the raw data from YouTube through the API and pushing it to MongoDB (a harvesting sketch follows this list).
- The second class, migration(), is responsible for migrating the information of the channels selected in the Streamlit web app from MongoDB to the MySQL warehouse (a migration sketch follows this list).
- The third class, Analysis(), is responsible for returning the result of the query selected in the Streamlit web app; these results are shown on the web application itself (a query sketch follows this list).
- The Analysis Queries.SQL file is attached for reference; it contains the queries used in the Python code.
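
As a rough illustration of the schema step above, here is a minimal sketch of loading the schema file into MySQL from Python. It assumes the mysql-connector-python package and plain CREATE TABLE statements in the file; the database name youtube_warehouse and the credentials are placeholders, not values taken from the repository (you can equally run the file directly in MySQL Workbench or the mysql client).

```python
# Minimal sketch: build the warehouse tables from the schema file.
# Assumes mysql-connector-python; credentials and the database name
# "youtube_warehouse" are placeholders, not the repository's actual values.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="root", password="your_password")
cursor = conn.cursor()
cursor.execute("CREATE DATABASE IF NOT EXISTS youtube_warehouse")
cursor.execute("USE youtube_warehouse")

with open("Youtube Warehouse Schemas.SQL") as f:
    # Run each CREATE TABLE statement in the schema file one at a time.
    for statement in f.read().split(";"):
        if statement.strip():
            cursor.execute(statement)

conn.commit()
conn.close()
```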
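
The sketch below shows how the API key and the two localhost connections are typically wired up, assuming the google-api-python-client, pymongo, and mysql-connector-python packages. The database names used here are placeholders for whatever the actual code defines.

```python
# Minimal connection setup sketch; database names below are placeholders.
from googleapiclient.discovery import build
from pymongo import MongoClient
import mysql.connector

API_KEY = "YOUR_API_KEY"  # generated from Google API Services

# YouTube Data API v3 client used for harvesting
youtube = build("youtube", "v3", developerKey=API_KEY)

# Local MongoDB instance that holds the raw harvested documents
mongo_client = MongoClient("mongodb://localhost:27017/")
mongo_db = mongo_client["youtube_harvest_db"]  # placeholder database name

# Local MySQL warehouse created from Youtube Warehouse Schemas.SQL
mysql_conn = mysql.connector.connect(
    host="localhost",
    user="root",
    password="your_password",
    database="youtube_warehouse",  # placeholder database name
)
```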
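
Conceptually, the harvesting step looks like the sketch below: fetch a channel's raw details through the API and store the document in MongoDB. The function name, document fields, and the collection name "channels" are illustrative assumptions, not the class's actual interface.

```python
# Harvesting sketch: pull raw channel data from YouTube and push it to MongoDB.
# Field and collection names here are assumptions for illustration.
def harvest_channel(youtube, mongo_db, channel_id):
    response = youtube.channels().list(
        part="snippet,contentDetails,statistics",
        id=channel_id,
    ).execute()

    item = response["items"][0]
    document = {
        "channel_id": channel_id,
        "channel_name": item["snippet"]["title"],
        "subscribers": int(item["statistics"]["subscriberCount"]),
        "total_videos": int(item["statistics"]["videoCount"]),
        "uploads_playlist": item["contentDetails"]["relatedPlaylists"]["uploads"],
    }
    mongo_db["channels"].insert_one(document)
    return document
```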
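
The migration step, in spirit, reads a selected channel's document from MongoDB and inserts it into the warehouse, roughly as below. The table and column names are assumptions; the real ones are defined in Youtube Warehouse Schemas.SQL.

```python
# Migration sketch: move one selected channel from MongoDB to MySQL.
# Table and column names are assumptions; see Youtube Warehouse Schemas.SQL.
def migrate_channel(mongo_db, mysql_conn, channel_name):
    doc = mongo_db["channels"].find_one({"channel_name": channel_name})
    cursor = mysql_conn.cursor()
    cursor.execute(
        "INSERT INTO channel (channel_id, channel_name, subscribers, total_videos) "
        "VALUES (%s, %s, %s, %s)",
        (doc["channel_id"], doc["channel_name"], doc["subscribers"], doc["total_videos"]),
    )
    mysql_conn.commit()
```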
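
Finally, the query step amounts to running the selected SQL against the warehouse and rendering the rows in the Streamlit app, roughly as sketched below. The example query in the usage comment is illustrative; the real queries are the ones listed in Analysis Queries.SQL.

```python
# Query sketch: run a warehouse query and display the result in Streamlit.
import pandas as pd
import streamlit as st

def show_selected_query(mysql_conn, sql):
    """Execute a warehouse query and render the rows in the Streamlit app."""
    cursor = mysql_conn.cursor()
    cursor.execute(sql)
    rows = cursor.fetchall()
    columns = [desc[0] for desc in cursor.description]
    st.dataframe(pd.DataFrame(rows, columns=columns))

# Illustrative usage; the real queries live in Analysis Queries.SQL:
# show_selected_query(mysql_conn, "SELECT channel_name, total_videos FROM channel")
```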