Commit 1b476de: Update index.md

Signed-off-by: pelinkeskin <[email protected]>
pelinkeskin authored Nov 13, 2023
Parent: 340c8e8
Showing 1 changed file (index.md) with 1 addition and 1 deletion.
<p>This repository contains personal projects I did in my free time and during my CS Master's degree.</p>

### [Kaggle Natural Language Processing with Disaster Tweets Competition](https://github.com/pelinkeskin/Personal_projects/tree/main/Kaggle_NLP_Disaster_Tweets) (11/2023)
This directory contains my entry for the [Kaggle Natural Language Processing with Disaster Tweets competition](https://www.kaggle.com/competitions/nlp-getting-started), which challenges participants to build a machine learning model that distinguishes genuine disaster-related tweets from the rest. My main goal was to deepen my proficiency with TensorFlow for training deep neural networks for text classification, and my model reached an accuracy above 80%. The notebook documents my dataset preprocessing, tailored to deep neural network (DNN) training. I initially trained a Long Short-Term Memory (LSTM) network with a trainable embedding layer, but the small training set led to overfitting, so I switched to the 200-dimensional Twitter variant of Stanford's [GloVe](https://nlp.stanford.edu/projects/glove/) embeddings. After exploring several architectures, I settled on a simple LSTM-CNN hybrid and tuned its hyperparameters with Bayesian optimization via Keras Tuner. The notebook is openly available on [Kaggle](https://www.kaggle.com/code/pelinkeskin/nlp-practice-tensorflow-lstm-cnn-glove), and I welcome comments and feedback.
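
Below is a minimal sketch of the kind of model and tuning setup described above, written in TensorFlow/Keras. It is not the notebook's exact code: the vocabulary size, sequence length, layer sizes, search ranges, and the zero-filled stand-in for the GloVe matrix are all illustrative assumptions.

```python
import numpy as np
from tensorflow.keras import layers, models, initializers
import keras_tuner as kt

# Assumed preprocessing outputs: a tokenizer vocabulary of vocab_size words,
# tweets padded to max_len tokens, and an embedding matrix whose rows would
# hold the 200-d GloVe Twitter vectors (zeros here as a stand-in).
vocab_size, max_len, embed_dim = 20000, 40, 200
embedding_matrix = np.zeros((vocab_size, embed_dim), dtype="float32")

def build_model(hp):
    """LSTM-CNN hybrid on top of a frozen pretrained embedding layer."""
    model = models.Sequential([
        layers.Input(shape=(max_len,)),
        layers.Embedding(
            vocab_size, embed_dim,
            embeddings_initializer=initializers.Constant(embedding_matrix),
            trainable=False),                       # keep GloVe vectors fixed
        layers.LSTM(hp.Int("lstm_units", 32, 128, step=32),
                    return_sequences=True),         # sequence features
        layers.Conv1D(hp.Int("filters", 32, 128, step=32), 3,
                      activation="relu"),           # local n-gram patterns
        layers.GlobalMaxPooling1D(),
        layers.Dropout(hp.Float("dropout", 0.2, 0.6, step=0.1)),
        layers.Dense(1, activation="sigmoid"),      # disaster vs. not
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Bayesian optimization over the hyperparameters declared in build_model.
tuner = kt.BayesianOptimization(build_model, objective="val_accuracy",
                                max_trials=10, overwrite=True)
# tuner.search(X_train, y_train, validation_split=0.2, epochs=10)
```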

### [Kaggle Digit Recognizer Competition](https://github.com/pelinkeskin/Personal_projects/tree/main/Kaggle_Digit_Recognizer) (10/2023)
This folder contains my submission for the [Kaggle Digit Recognizer competition](https://www.kaggle.com/competitions/digit-recognizer), which asks participants to identify digits in a dataset of tens of thousands of handwritten images. My main goal was to sharpen my skills with TensorFlow for training deep convolutional neural networks (CNNs) for image classification. My model achieved an accuracy above 99%, placing in the top 30% of participants. The network consists of four convolutional layers and two fully connected layers, including the output layer. I one-hot encoded the labels, used categorical cross-entropy as the loss function, and predicted the class with the highest probability. After empirical testing, I chose the Adam optimizer over RMSProp for its better performance. To make the model more robust against overfitting I added dropout and early stopping, and I introduced a learning-rate decay on plateaus for better convergence. The notebook is publicly available on [Kaggle](https://www.kaggle.com/code/pelinkeskin/cnn-with-tensorflow-practice), and I welcome comments and feedback.
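
As a rough illustration, here is how that architecture and training setup might look in TensorFlow/Keras. The filter counts, dense-layer width, dropout rate, and callback thresholds are assumptions; only the overall shape (four convolutional layers, two fully connected layers, one-hot labels with categorical cross-entropy, Adam, dropout, early stopping, and learning-rate decay on plateaus) comes from the description above.

```python
from tensorflow.keras import layers, models, callbacks

# Four convolutional layers followed by two fully connected layers
# (the second being the 10-way softmax output).
model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),          # 28x28 grayscale digit images
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),     # first fully connected layer
    layers.Dropout(0.4),                      # regularization against overfitting
    layers.Dense(10, activation="softmax"),   # output: one unit per digit class
])

# One-hot labels pair with categorical cross-entropy; Adam beat RMSProp
# in the empirical tests mentioned above.
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

# Early stopping and learning-rate decay on plateaus.
cbs = [
    callbacks.EarlyStopping(monitor="val_loss", patience=5,
                            restore_best_weights=True),
    callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=2),
]
# model.fit(x_train, y_train_onehot, validation_split=0.1,
#           epochs=50, callbacks=cbs)
```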