I’m a data enthusiast who’s all about soaking up knowledge and getting my hands dirty in data science, engineering, and coding. I have substantial experience crafting end-to-end analytics pipelines on Google Cloud Platform, specialising in cost-effective serverless frameworks. On the side, I love diving into cool projects, from applying machine learning to analytics problems to building websites.
Jul 2023 - Present, UK, Remote
I am currently building Jetty’s data infrastructure from scratch. Jetty is a white-label customer management and billing platform for gigabit fibre-to-the-home and other digital services.
Oct 2024 - Present
Jul 2023 - Oct 2024
Forecast is a data consultancy with high-profile clients across several industries. For most of my time there, my client was the loyalty department of one of the largest companies in the UK telecommunications industry.
St. James’s Place is a FTSE 100 financial services company specialising in wealth management. It operates through several affiliated businesses, known as Partners, spread across the UK.
2015-2019 BSc Honours Mathematics (First Class)
Mark: 75 out of 100
Final Year Subjects
2009-2015 Higher Secondary School Certificate
Higher Subjects
To support the transition from a single-client setup to a multi-client model, I built a centralized Terraform repository for our analytics infrastructure on GCP. Working closely with Google’s DevOps team, I implemented CI/CD using Cloud Build to automate and standardize environment creation. This transformation allowed us to scale efficiently, spin up client-specific environments with ease, and significantly improve the reliability of our cloud infrastructure.
To address persistent data quality issues, I implemented the dq-utils dbt package to log failing test results and link them to the original source data. I then built a Looker dashboard that provided real-time visibility into data issues, trend analysis, row-level drilldowns, and Jira integration for issue tracking. This solution enabled proactive data ownership, alerting system owners when quality thresholds were breached, significantly improving the overall reliability of our data.
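The logging idea behind this can be sketched in plain Python (an illustrative stand-in for the actual dbt package, not its API; the rule, field, and function names are assumptions): apply a quality rule to source rows and record each failure together with its primary key, so a dashboard can drill down to the original record.

```python
from datetime import datetime, timezone

def log_failing_rows(rows, rule, rule_name, key="customer_id"):
    """Apply a data-quality rule to source rows and log each failure
    with its primary key, so it can be traced back to the source data."""
    failures = []
    for row in rows:
        if not rule(row):
            failures.append({
                "rule": rule_name,
                "source_key": row[key],  # link back to the original record
                "logged_at": datetime.now(timezone.utc).isoformat(),
            })
    return failures

# Hypothetical rule: every customer row must have a non-empty email.
rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": ""},
]
failures = log_failing_rows(rows, lambda r: bool(r["email"]), "email_not_empty")
```

A dashboard or alerting job could then count entries per rule over time and raise an alert when a threshold is breached.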
I designed and built a unified data model in dbt to consolidate all customer “trade” events, including sales, package changes, and churn, into a single source of truth. Collaborating with operational teams, I mapped user journeys to define event logic, implemented rigorous testing and documentation, and created a Looker dashboard that now serves as a trusted operational reporting tool. This solution enables the business to track customer movement, identify underperforming sales channels, and make data-driven decisions with confidence.
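The consolidation pattern can be illustrated in plain Python (a stand-in for the dbt SQL; the field names are assumptions): normalise each event source onto a common schema, then union them into one chronologically ordered "trade" stream.

```python
def unify_trade_events(sales, package_changes, churns):
    """Map three event sources onto a common schema and return one
    chronologically ordered event stream -- a single source of truth."""
    events = []
    for s in sales:
        events.append({"customer_id": s["customer_id"],
                       "event_type": "sale", "event_at": s["sold_at"]})
    for p in package_changes:
        events.append({"customer_id": p["customer_id"],
                       "event_type": "package_change", "event_at": p["changed_at"]})
    for c in churns:
        events.append({"customer_id": c["customer_id"],
                       "event_type": "churn", "event_at": c["churned_at"]})
    return sorted(events, key=lambda e: e["event_at"])

# One customer's journey: bought, changed package, then churned.
stream = unify_trade_events(
    sales=[{"customer_id": 1, "sold_at": "2024-01-05"}],
    package_changes=[{"customer_id": 1, "changed_at": "2024-03-10"}],
    churns=[{"customer_id": 1, "churned_at": "2024-06-01"}],
)
```

In SQL this maps naturally onto a `UNION ALL` over the three sources with a shared column list, which is the shape a dbt model of this kind typically takes.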
To address inconsistent and duplicated reporting requests, I developed a lightweight intake process based on BEAM data modeling principles. This involved collaborative sessions with stakeholders to define key user stories and metrics, an Excel mock dataset to map expected dimensions, and an ER model to assess feasibility. The process improved clarity, reduced duplicate requests, and enabled better prioritisation — ultimately streamlining reporting workflows and strengthening alignment between data and business teams.
I designed and developed a website for a jungle trekking company in Sumatra, aimed at boosting their online presence and increasing direct bookings. Over a 6-week sprint, I led the project end-to-end — from photography and design to content strategy and marketing — gathering regular feedback to iterate quickly. The result was a visually engaging, user-friendly site that significantly improved the company’s online booking rates and became a major driver of new income.
An analytics engineering business reporting pipeline built for a large telecommunications company, written with best-practice data modelling, robust testing, and comprehensive documentation. Features include incremental models to handle large data volumes, separate environments for dev and prod, pre-hook validation of manually entered data, and a BQML analysis step that runs at the end of the pipeline.
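The incremental pattern the pipeline relies on (what dbt’s `is_incremental()` filter does) can be sketched in plain Python, assuming a timestamp high-water mark: on each run, only rows newer than the latest already-processed timestamp are picked up, so historical data is never reprocessed.

```python
def incremental_run(source_rows, existing_rows, ts_key="loaded_at"):
    """Append only rows newer than the high-water mark of the target
    table -- the essence of an incremental model."""
    if existing_rows:  # incremental run: filter to new rows only
        high_water = max(r[ts_key] for r in existing_rows)
        new_rows = [r for r in source_rows if r[ts_key] > high_water]
    else:              # first run: full refresh
        new_rows = list(source_rows)
    return existing_rows + new_rows

# First run loads everything; the second run picks up only row 2.
target = incremental_run([{"id": 1, "loaded_at": "2024-01-01"}], [])
target = incremental_run(
    [{"id": 1, "loaded_at": "2024-01-01"}, {"id": 2, "loaded_at": "2024-02-01"}],
    target,
)
```

This keeps each run’s scan and compute proportional to the new data rather than the full history, which is what makes the approach cost-effective on BigQuery.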