My name is John, and I'm a data science enthusiast with a particular interest in computer vision. Self-taught since last year, I have developed strong proficiency in Python, SQL, and R, along with expertise in data analytics and machine learning.
| Skill | Stack |
|---|---|
| Programming Language | |
| Framework | |
| Visualization Tools | |
| Database | |
| Deployment | |
| Other Tools | |
A chatbot built for Carsome, a vehicle marketplace. The chatbot is designed to assist customers with administrative questions and car recommendations. The objective of this project is to create a chatbot application (Virtual Shopping Assistant) that provides 24/7 customer service and offers product recommendations to increase revenue, based on customer experience, LTV (lifetime value), and AOV (average order value) variables.
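One way such a chatbot could split incoming messages between administrative help and car recommendations is simple keyword-based intent routing. This is only a minimal sketch; the intent names and keywords below are hypothetical, not taken from the actual project.

```python
# Minimal intent-routing sketch for a shopping-assistant chatbot.
# Intent names and keyword lists are hypothetical placeholders.
INTENT_KEYWORDS = {
    "admin": {"invoice", "payment", "refund", "appointment"},
    "recommendation": {"recommend", "suggest", "budget", "suv", "sedan"},
}

def route_intent(message: str) -> str:
    """Return the intent whose keywords overlap the message the most."""
    words = set(message.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    best_intent, best_score = max(scores.items(), key=lambda kv: kv[1])
    return best_intent if best_score > 0 else "fallback"

print(route_intent("Can you recommend a good SUV?"))  # routes to "recommendation"
```

A production chatbot would replace this with an NLU model or an intent classifier, but the routing structure stays the same.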
In this project, we create a system that automates an ETL task and monitors real-time data inside the database using Kibana. The data we use is Data Science job posts from Glassdoor (the dataset itself comes from Kaggle). Once the gathered data has been cleaned and transformed in Airflow, it is loaded into the database and uploaded to Elasticsearch for monitoring through Kibana.
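The extract-clean-load steps could be sketched as below in plain Python; in the project they would be wrapped as Airflow tasks, and the load target would be the production database and Elasticsearch rather than SQLite. The CSV columns here are hypothetical stand-ins for the Glassdoor fields.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; real data comes from the Kaggle/Glassdoor dataset.
RAW_CSV = """job_title,salary_estimate
Data Scientist,$95K-$120K
 data analyst ,$60K-$80K
"""

def extract(source: str) -> list:
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list) -> list:
    """Normalize job titles and parse the lower salary bound into an integer."""
    cleaned = []
    for row in rows:
        title = row["job_title"].strip().title()
        low = int(row["salary_estimate"].split("-")[0].strip("$K"))
        cleaned.append((title, low * 1000))
    return cleaned

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Persist cleaned rows; stands in for the database/Elasticsearch load."""
    conn.execute("CREATE TABLE IF NOT EXISTS jobs (title TEXT, salary_low INTEGER)")
    conn.executemany("INSERT INTO jobs VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
```

In Airflow, each of these three functions would become a task in a DAG, with `extract >> transform >> load` defining the dependency order.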
The goal is to predict the hand sign shown when playing rock paper scissors. We use a convolutional neural network (CNN) to train the model, and we transform each image in the dataset so that only the edges of the hand remain, which improves the model's performance.
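The edge-extraction preprocessing could look like the NumPy sketch below. This is an assumption about the method: the project may use a library routine such as OpenCV's Canny instead, but a Sobel-style filter illustrates the idea of keeping only intensity transitions.

```python
import numpy as np

def sobel_edges(img: np.ndarray) -> np.ndarray:
    """Approximate edge magnitude of a 2-D grayscale image with Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel is the transpose
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = np.sum(patch * kx)  # horizontal gradient
            gy = np.sum(patch * ky)  # vertical gradient
            out[i, j] = np.hypot(gx, gy)
    return out

# Synthetic test image: dark left half, bright right half -> one vertical edge.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = sobel_edges(img)
```

Training the CNN on these edge maps instead of raw photos removes background and color variation, so the network focuses on the hand's contour.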
We want to create a machine learning model that classifies bank loan applicants as risky or non-risky. In this project we use five different classification models: K-Nearest Neighbours (KNN), Support Vector Machine (SVM), Decision Tree, Random Forest, and XGBoost Classifier. After identifying the best base model, we fine-tune its hyperparameters to improve its accuracy. Finally, we compare the tuned model with the base one to choose the best option for predicting loan approval accurately and reliably.
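The base-model comparison could be sketched as follows with scikit-learn. This sketch uses synthetic data in place of the loan dataset and omits XGBoost (a third-party dependency); the actual project's features and preprocessing are not shown here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the loan-applicant dataset (features are hypothetical).
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
}

# Score each base model on the held-out split to pick the best candidate.
scores = {name: model.fit(X_train, y_train).score(X_test, y_test)
          for name, model in models.items()}
best = max(scores, key=scores.get)
```

The winner of this loop would then go through hyperparameter tuning (e.g. with `GridSearchCV`) before the final tuned-vs-base comparison.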
Created a problem-identification method using the SMART framework (Specific, Measurable, Achievable, Relevant, Time-bound) for a fictitious company, and performed an exploratory data analysis based on the dataset.
The goal is to web scrape all Tokopedia products for a given search term (in this instance, seblak) and then transform the gathered information into tabular data for analysis.
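The parse-and-tabulate step could be sketched as below using only the standard library. The HTML structure and class names here are hypothetical (Tokopedia's real markup differs), and the actual project would first fetch the live search-result pages with an HTTP client.

```python
from html.parser import HTMLParser

# Hypothetical product-card markup standing in for a fetched results page.
HTML = """
<div class="product"><span class="name">Seblak Instan Pedas</span>
<span class="price">Rp15.000</span></div>
<div class="product"><span class="name">Seblak Kering</span>
<span class="price">Rp22.500</span></div>
"""

class ProductParser(HTMLParser):
    """Collect (name, price) rows from <span class="name">/<span class="price"> tags."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished (name, price) tuples
        self._field = None    # field the next text chunk belongs to
        self._current = {}    # partially built row

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if len(self._current) == 2:  # both fields seen -> row complete
                self.rows.append((self._current["name"], self._current["price"]))
                self._current = {}

parser = ProductParser()
parser.feed(HTML)
```

From `parser.rows`, the tuples drop straight into a DataFrame or CSV for the analysis step. In practice a library like BeautifulSoup makes this parsing more concise, but the extraction logic is the same.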