A-LEVEL:
Mathematics A*, Further Mathematics A, Physics A
2019 - 2023
First Class Honours degree (1:1)
2023 - 2024
Expected classification: 2:1 or above
Python, Java, C#, Haskell, GLSL
PyTorch, TensorFlow, Anaconda, scikit-learn, Keras, OpenCV, Hugging Face Transformers
April 2022 - July 2022
May 2023 - July 2023
Developed a neural network model to evaluate the rationality of news headlines, focusing on handling imbalanced data in text classification. Implemented data augmentation techniques including thesaurus-based synonym substitution, whole-sentence paraphrasing with the Parrot framework, and kernel density estimation (KDE), and combined them with a Bi-LSTM model and Label Distribution Smoothing (LDS) for improved robustness and accuracy. The paraphrasing step rebalanced the training distribution and improved prediction accuracy. Published a paper comparing data augmentation methods for text classification at the 2022 IEEE 14th ICCRD as a co-first author.
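A minimal sketch of the LDS re-weighting idea mentioned above, assuming continuous rationality scores binned over the label space; the function name, bin count, and kernel width are illustrative, not the project's actual code.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def lds_weights(labels, num_bins=20, sigma=2.0):
    """Per-sample loss weights via Label Distribution Smoothing.

    The empirical label histogram is smoothed with a Gaussian kernel
    (akin to a KDE over the label space); samples in dense regions get
    lower weights, samples in sparse regions get higher weights.
    """
    # Bin the continuous rationality scores.
    bins = np.linspace(labels.min(), labels.max(), num_bins + 1)
    bin_ids = np.clip(np.digitize(labels, bins) - 1, 0, num_bins - 1)

    # Empirical counts per bin, then Gaussian smoothing -> effective density.
    counts = np.bincount(bin_ids, minlength=num_bins).astype(float)
    effective = gaussian_filter1d(counts, sigma=sigma)

    # Inverse-density weights, normalised to mean 1 for stable training.
    weights = 1.0 / np.maximum(effective[bin_ids], 1e-8)
    return weights * len(weights) / weights.sum()

# Example: scores skewed towards "rational" headlines.
scores = np.concatenate([np.random.uniform(0.7, 1.0, 900),
                         np.random.uniform(0.0, 0.7, 100)])
w = lds_weights(scores)  # usable as per-sample weights in the Bi-LSTM loss
```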
Collaborated with two students to design and develop a smart Android application combining IoT technology with deep learning. The application collects user activity data from wearable sensors for real-time activity recognition and historical tracking. Implemented a deep learning model integrating Long Short-Term Memory (LSTM) networks with a Fully Convolutional Network (FCN) for multivariate time-series classification, and enhanced accuracy with a squeeze-and-excitation block, reaching 99% accuracy on test data.
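A compact PyTorch sketch of an LSTM-FCN classifier with squeeze-and-excitation blocks, in the spirit of the activity-recognition model described above; layer sizes and channel counts are illustrative assumptions, not the app's actual configuration.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: re-weight channels with a learned gate."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                     # x: (batch, channels, time)
        gate = self.fc(x.mean(dim=-1)).unsqueeze(-1)   # squeeze over time
        return x * gate                                # excite per channel

class LSTMFCN(nn.Module):
    def __init__(self, n_channels, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.fcn = nn.Sequential(
            nn.Conv1d(n_channels, 128, 8, padding='same'), nn.BatchNorm1d(128), nn.ReLU(),
            SEBlock(128),
            nn.Conv1d(128, 256, 5, padding='same'), nn.BatchNorm1d(256), nn.ReLU(),
            SEBlock(256),
            nn.Conv1d(256, 128, 3, padding='same'), nn.BatchNorm1d(128), nn.ReLU())
        self.head = nn.Linear(hidden + 128, n_classes)

    def forward(self, x):                     # x: (batch, time, sensor_channels)
        _, (h, _) = self.lstm(x)              # last hidden state of LSTM branch
        conv = self.fcn(x.transpose(1, 2))    # FCN branch expects (batch, C, T)
        conv = conv.mean(dim=-1)              # global average pooling over time
        return self.head(torch.cat([h[-1], conv], dim=1))

# Toy forward pass: 6 sensor channels, 128 time steps, 6 activity classes.
logits = LSTMFCN(n_channels=6, n_classes=6)(torch.randn(4, 128, 6))
```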
Developed a lightweight AI chatbot for mental health assistance, providing initial psychological support and advice using natural language processing. Used Python, the Transformers library, and PyTorch to fine-tune a T5 model for mental health counselling, training it on a publicly available Hugging Face dataset of mental health-related questions and positive responses.
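A minimal fine-tuning sketch along those lines using Transformers and PyTorch; the checkpoint name and the toy question/answer pair are placeholders, since the specific Hugging Face dataset is not named here.

```python
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder checkpoint and data; the real counselling question/response
# pairs would be loaded from the Hugging Face Hub instead.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
pairs = [("I feel anxious before exams, what can I do?",
          "Breaking revision into small steps and practising slow breathing can help.")]

optimizer = AdamW(model.parameters(), lr=3e-4)
model.train()
for question, answer in pairs:
    inputs = tokenizer("counsel: " + question, return_tensors="pt",
                       truncation=True, max_length=256)
    labels = tokenizer(answer, return_tensors="pt",
                       truncation=True, max_length=256).input_ids
    loss = model(**inputs, labels=labels).loss   # seq2seq cross-entropy
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Inference: generate a supportive reply for a new prompt.
model.eval()
out = model.generate(**tokenizer("counsel: I can't sleep at night.",
                                 return_tensors="pt"), max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```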
Conducted a study on image captioning, evaluating three decoders (LSTM, Attention LSTM, Transformer) paired with different encoders (ResNet-18, ResNet-50, ResNet-101) under limited training data. Findings indicated that the Attention LSTM excelled with limited data, the plain LSTM was effective with simpler encoders, and the Transformer showed potential with complex encoders and more data. These results offer practical guidance for model selection, suggesting LSTM or Attention LSTM decoders for limited-data scenarios with short time dependencies, such as stock market prediction.
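A simplified sketch of one encoder/decoder pairing from the study (ResNet backbone feeding an LSTM decoder); the torchvision ResNet-18 backbone, vocabulary size, and embedding size are illustrative assumptions rather than the study's exact setup.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class CNNEncoder(nn.Module):
    """ResNet backbone with the classification head replaced by a projection."""
    def __init__(self, embed_dim=256):
        super().__init__()
        backbone = resnet18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop fc
        self.proj = nn.Linear(backbone.fc.in_features, embed_dim)

    def forward(self, images):                       # (batch, 3, 224, 224)
        return self.proj(self.features(images).flatten(1))   # (batch, embed_dim)

class LSTMDecoder(nn.Module):
    """Feed the image embedding as the first step, then teacher-forced tokens."""
    def __init__(self, vocab_size, embed_dim=256, hidden=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, img_embed, captions):          # captions: (batch, seq)
        tokens = self.embed(captions)
        inputs = torch.cat([img_embed.unsqueeze(1), tokens], dim=1)
        hidden, _ = self.lstm(inputs)
        return self.out(hidden)                      # logits per position

encoder, decoder = CNNEncoder(), LSTMDecoder(vocab_size=10000)
logits = decoder(encoder(torch.randn(2, 3, 224, 224)),
                 torch.randint(0, 10000, (2, 15)))
```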
Implemented a Transformer model with multi-head attention to improve a neural machine translation (NMT) system for German-to-English translation. Using PyTorch, developed the multi-head attention mechanism from scratch, allowing the model to attend to different parts of the input sentence simultaneously. The project involved implementing the baseline Transformer architecture and extensive training and evaluation to tune performance.
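A from-scratch multi-head attention module in PyTorch, in the spirit of that implementation; the model dimension and head count are illustrative, and this is a sketch rather than the project's exact code.

```python
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention computed in parallel over several heads."""
    def __init__(self, d_model=512, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0
        self.d_head = d_model // num_heads
        self.num_heads = num_heads
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value, mask=None):
        batch = query.size(0)

        def split(x, proj):  # (batch, seq, d_model) -> (batch, heads, seq, d_head)
            return proj(x).view(batch, -1, self.num_heads, self.d_head).transpose(1, 2)

        q, k, v = split(query, self.w_q), split(key, self.w_k), split(value, self.w_v)

        # Scaled dot-product attention per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float('-inf'))
        attn = scores.softmax(dim=-1)

        # Merge heads back and apply the output projection.
        out = (attn @ v).transpose(1, 2).contiguous().view(
            batch, -1, self.num_heads * self.d_head)
        return self.w_o(out)

# Self-attention over a toy batch of source-sentence embeddings.
x = torch.randn(2, 10, 512)
y = MultiHeadAttention()(x, x, x)   # (2, 10, 512)
```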