Hi there 👋, I'm Ibrahim Giwa
A machine learning engineer who is passionate about developing and deploying AI models that address energy-related challenges and contribute to a sustainable future. I strive to leverage innovative technologies to optimize energy systems and enhance efficiency across various applications.
🔭 I’m currently working as a research assistant at Bells University of Technology, focusing on projects related to renewable energy and power systems.
🌱 I’m currently learning advanced machine-learning techniques through an internship with @Optimus AI Labs, while honing my skills in Python, MATLAB, data analysis, neural networks, and model deployment.
👯 I’m looking to collaborate on open-source projects and research initiatives that aim to enhance energy efficiency and sustainability.
📫 How to reach me: [email protected] | LinkedIn
📄 Learn more about my experience: Link to CV
⚡ Fun fact: When I'm not working on projects, I enjoy exploring new technologies, conducting research, and traveling to meet new people and gain diverse experiences.
This was my undergraduate final-year project.
Duration: 6 months (between Jan 2023 and July 2023).
Project Goal
Develop and simulate an efficient machine learning-based wind speed prediction model for renewable energy solutions to power telecommunication base stations in Nigeria.
Project Overview
- Data was collected from the Nigerian Meteorological Agency (NIMET) and comprised a 10-year (2007-2018) wind speed dataset from Kano State, Nigeria.
- Preprocessing steps such as data cleaning and feature engineering were carried out.
- LSTM, ANN, GRU, and SimpleRNN models were trained, and the LSTM outperformed the others on the error metrics.
- LSTM hyperparameters such as the number of neurons, layers, timesteps, engineered features, and the optimizer were tuned iteratively to optimize the model's accuracy, and the tuned model was then compared with the other neural network variants (a minimal sketch of the pipeline follows below).
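Below is a minimal sketch of the windowing and LSTM training step described above. The file name, column name, window size, and layer sizes are illustrative assumptions rather than the exact project configuration.

```python
# Minimal sketch of the wind speed LSTM pipeline (illustrative only).
# "kano_wind_speed.csv", the "wind_speed" column, the 12-step window, and the
# layer sizes are placeholder assumptions, not the exact project settings.
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

df = pd.read_csv("kano_wind_speed.csv")                    # hypothetical file
series = df["wind_speed"].astype(float).values.reshape(-1, 1)

scaler = MinMaxScaler()
series = scaler.fit_transform(series)

def make_windows(values, timesteps=12):
    """Turn a 1-D series into (samples, timesteps, 1) windows and next-step targets."""
    X, y = [], []
    for i in range(len(values) - timesteps):
        X.append(values[i:i + timesteps])
        y.append(values[i + timesteps])
    return np.array(X), np.array(y)

X, y = make_windows(series, timesteps=12)
split = int(0.8 * len(X))                                  # simple chronological split
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

model = Sequential([
    LSTM(64, input_shape=(X.shape[1], 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=50, batch_size=32, validation_split=0.1, verbose=0)

rmse = np.sqrt(model.evaluate(X_test, y_test, verbose=0))
print(f"Test RMSE (scaled units): {rmse:.4f}")
```

The same windowed arrays can be fed to GRU or SimpleRNN layers for the model comparison, swapping only the recurrent layer.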
This was a virtual Hackathon project organized by Optimus AI Labs.
Duration: 5 days (between April 8th and April 12th, 2024).
Project Scope
Develop a user-interface-driven model for forecasting energy consumption for buildings, areas, or utilities, as well as predicting renewable energy output from sources such as wind turbines or solar panels, taking weather conditions and environmental factors into account.
Project Overview
- The dataset was sourced from Kaggle and comprises two sets spanning four years (2015-2018): energy consumption and generation data, and weather features data from the five largest cities in Spain.
- Preprocessing included removing unnecessary columns and missing values, renaming columns, merging the two datasets on their date-time columns, checking for outliers, and feature engineering by splitting the date-time into hour, day, month, and year columns.
- Several machine learning (ML) models were tried, including LSTM, Random Forest (RF), XGBoost, Gradient Boosting, and AdaBoost; the RF regression model produced the lowest error, with a Root Mean Square Error (RMSE) of 3.301 (a training sketch follows this list).
- The model was deployed with the Streamlit library to provide an easy-to-use interface, as shown below (a minimal front-end sketch also follows).
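A compact sketch of the date-time feature engineering and RF training step is shown below. The column names follow the public Kaggle energy dataset, but the exact feature subset, split, and hyperparameters used in the hackathon are assumptions.

```python
# Sketch of the date-time feature engineering and RF regression step (illustrative).
# Column names ("time", "dt_iso", "total load actual", "temp", "wind_speed") follow
# the Kaggle energy dataset, but the feature subset and split are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

energy = pd.read_csv("energy_dataset.csv", parse_dates=["time"])
weather = pd.read_csv("weather_features.csv", parse_dates=["dt_iso"])
weather = weather[weather["city_name"] == "Madrid"]        # keep one city for simplicity

# Merge the consumption/generation data with the weather features on the timestamp.
df = energy.merge(weather, left_on="time", right_on="dt_iso", how="inner").dropna()

# Feature engineering: split the timestamp into hour/day/month/year columns.
df["hour"] = df["time"].dt.hour
df["day"] = df["time"].dt.day
df["month"] = df["time"].dt.month
df["year"] = df["time"].dt.year

features = ["hour", "day", "month", "year", "temp", "wind_speed"]  # assumed subset
target = "total load actual"

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df[target], test_size=0.2, random_state=42
)

rf = RandomForestRegressor(n_estimators=200, random_state=42)
rf.fit(X_train, y_train)
rmse = np.sqrt(mean_squared_error(y_test, rf.predict(X_test)))
print(f"RMSE: {rmse:.3f}")
```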
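And a minimal Streamlit front end in the spirit of the deployed interface; the saved model file and input widgets are hypothetical stand-ins for the actual app.

```python
# Minimal Streamlit front end in the spirit of the deployed app (illustrative).
# "rf_energy_model.pkl" and the input widgets are placeholder assumptions.
import joblib
import pandas as pd
import streamlit as st

st.title("Energy Consumption Forecast")

model = joblib.load("rf_energy_model.pkl")   # hypothetical saved RF model

hour = st.slider("Hour of day", 0, 23, 12)
day = st.slider("Day of month", 1, 31, 15)
month = st.slider("Month", 1, 12, 6)
year = st.number_input("Year", min_value=2015, max_value=2030, value=2018)
temp = st.number_input("Temperature (K)", value=290.0)
wind_speed = st.number_input("Wind speed (m/s)", value=3.0)

if st.button("Predict"):
    X = pd.DataFrame([[hour, day, month, year, temp, wind_speed]],
                     columns=["hour", "day", "month", "year", "temp", "wind_speed"])
    prediction = model.predict(X)[0]
    st.success(f"Forecast energy consumption: {prediction:,.1f} MW")
```

Saved as `app.py`, this runs with `streamlit run app.py`.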
This is a master's project I volunteered for during my mandatory one-year service at Bells University, Ota.
Duration: 2 months (between June and July 2024).
Project Goal
Design and simulate a predictive maintenance model for a 60 MVA power transformer at the Akangba 330 kV/132 kV transmission substation.
Project Overview
- Data was collected from one of the 60 MVA power transformers at the Akangba Transmission Substation in Lagos State, Nigeria, and comprised three years (2021-2023) of breakdown and preventive maintenance records.
- The preprocessing stage involved feature engineering, where the necessary features were computed to derive a binary-classification target variable, using the mean and standard deviation of selected features as the threshold.
- Tree-based models such as Random Forest (RF), XGBoost, LightGBM, and AdaBoost, as well as an Artificial Neural Network (ANN), were trained. Although all models achieved the same accuracy (97%), the RF classifier was preferred because of its shorter computational time and minimal hyperparameter tuning.
- RF hyperparameters such as the number of estimators, minimum samples split, minimum samples leaf, maximum features, maximum depth, and bootstrap were tuned with random search to optimize the model's accuracy, and the optimized RF model was then compared with the other tree-based and ANN models (a sketch of the labelling and tuning follows below).
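Below is a small sketch of the threshold-based labelling and the random-search RF tuning described above. The file name, the "oil_temperature" feature, the mean-plus-one-standard-deviation rule, and the search ranges are illustrative assumptions, not the project's exact configuration.

```python
# Sketch of the threshold-based labelling and random-search RF tuning (illustrative).
# "akangba_transformer_maintenance.csv", "oil_temperature", and the mean + 1 std rule
# are assumptions standing in for the project's actual features and threshold.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

df = pd.read_csv("akangba_transformer_maintenance.csv")   # hypothetical file

# Derive the binary target: flag records where the feature exceeds mean + 1 std.
threshold = df["oil_temperature"].mean() + df["oil_temperature"].std()
df["needs_maintenance"] = (df["oil_temperature"] > threshold).astype(int)

# Drop the thresholded feature so the label is not trivially leaked.
X = df.drop(columns=["needs_maintenance", "oil_temperature"]).select_dtypes("number")
y = df["needs_maintenance"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Random search over the hyperparameters listed above (illustrative ranges).
param_distributions = {
    "n_estimators": [100, 200, 400, 800],
    "min_samples_split": [2, 5, 10],
    "min_samples_leaf": [1, 2, 4],
    "max_features": ["sqrt", "log2", None],
    "max_depth": [None, 10, 20, 40],
    "bootstrap": [True, False],
}
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions=param_distributions,
    n_iter=50, cv=5, scoring="accuracy", random_state=42, n_jobs=-1,
)
search.fit(X_train, y_train)
print("Best parameters:", search.best_params_)
print(f"Test accuracy: {search.best_estimator_.score(X_test, y_test):.2%}")
```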
If you'd like to reach out about this portfolio, work opportunities, or collaboration, feel free to contact me via LinkedIn or Gmail.