$ whoami
Database, DevOps, and Platform Engineer at Kyndryl, architecting enterprise-grade PostgreSQL 17 and MSSQL infrastructure for mission-critical financial platforms processing 50M+ transactions daily on hybrid Azure & AWS. I build Patroni HA clusters, automate with Terraform, Ansible & Python/FastAPI, and optimize with Crunchy Data tooling. Trained in advanced PostgreSQL directly with PostgreSQL core developers; 99.995% uptime across high-volume payment systems.
Database Engineer at Kyndryl with 6+ years in SQL Server and deep expertise in PostgreSQL 17. Currently leading database engineering for enterprise financial platforms processing 50M+ daily transactions on hybrid Azure and AWS cloud. I architect Patroni HA clusters, build CI/CD pipelines, and implement full observability stacks using Grafana, OpenSearch, and Prometheus exporters.
My philosophy is simple: infrastructure should be immutable, observable, and self-documenting. Every database I manage is provisioned through Terraform, configured via Ansible, and automated with Python/FastAPI and Bash scripts. I optimize PostgreSQL performance using Crunchy Data tooling and ensure mission-critical financial systems achieve 99.995% uptime.
Completed intensive advanced PostgreSQL training directly with PostgreSQL core developers, gaining deep expertise in PostgreSQL 17 internals, performance tuning, and enterprise patterns. When you've managed 9+ petabytes across financial systems with zero unplanned downtime, you understand the value of continuous learning and operational excellence.
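Keeping a Patroni HA cluster observable usually starts with its REST API, which serves cluster topology as JSON from each node. A minimal sketch in Python, assuming default Patroni endpoints and an illustrative node address (not a real deployment):

```python
"""Minimal sketch: polling Patroni's REST API for HA cluster health.

Each Patroni node serves an HTTP API (default port 8008); GET /cluster
returns the topology as JSON. The host name below is a hypothetical
placeholder, not a real node.
"""
import json
from urllib.request import urlopen


def summarize_cluster(cluster_json: dict) -> dict:
    """Reduce Patroni's /cluster payload to role counts and max replica lag."""
    roles: dict[str, int] = {}
    max_lag = 0
    for member in cluster_json.get("members", []):
        roles[member["role"]] = roles.get(member["role"], 0) + 1
        lag = member.get("lag", 0)          # replicas report lag; may be "unknown"
        if isinstance(lag, int):
            max_lag = max(max_lag, lag)
    return {"roles": roles, "max_replica_lag_bytes": max_lag}


def check_cluster(base_url: str = "http://pg-node-1:8008") -> dict:
    """Fetch and summarize the cluster state from one node's REST API."""
    with urlopen(f"{base_url}/cluster", timeout=5) as resp:
        return summarize_cluster(json.load(resp))
```

A summary like this feeds naturally into a Prometheus exporter or a Grafana alert rule; the parsing is kept separate from the HTTP call so it can be tested offline.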
Specializing in Data Analytics, Database Systems & Machine Learning
DBMS, Machine Learning, AI, Software Engineering, Data Structures, Networks
Automation Engineering @ Cotiviti — Built automated disaster recovery system for Vertica databases
Client: NAB (National Australia Bank)
Enterprise-grade solution automating database backups across 200+ databases, reducing verification time by 85%.
LSTM neural network predicting storage requirements across 330+ databases with 80% accuracy.
ETL pipeline transferring 2TB+ daily from Oracle to Vertica using PySpark and Apache Kafka.
End-to-end streaming pipeline using PySpark and Medallion Architecture with ML forecasting models.
Python validation framework automating migration verification for 50+ databases — 95% faster validation.
XGBoost forecasting with time-series analysis and weather integration for Australian energy markets.
DAMA-DMBOK maturity evaluation identifying gaps in storage, master data, and warehousing with AI policy recommendations.
Data preprocessing, cleaning, and visualization dashboard built with Excel and Power BI.
ML pipeline using ensemble learning and anomaly detection — 94% accuracy on the DE-SynPUF dataset.
Web scraper and high-performance database for greyhound race results and betting analytics.
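The migration-validation project above boils down to proving that source and target databases hold the same data. A minimal sketch of that idea, comparing per-table row counts and order-insensitive checksums; sqlite3 stands in for the real source and target engines, and the table name in the usage is illustrative:

```python
"""Sketch of post-migration validation: compare row counts and content
checksums between a source and a target database. sqlite3 stands in for
the real engines here; any table names are illustrative."""
import hashlib
import sqlite3


def table_fingerprint(conn: sqlite3.Connection, table: str) -> tuple:
    """Return (row_count, checksum) for one table.

    Row digests are combined with XOR so the checksum ignores row order
    (note: duplicate-row pairs cancel under XOR; acceptable for a sketch).
    """
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return len(rows), acc


def validate(source, target, tables):
    """Return the list of tables whose fingerprints differ."""
    return [t for t in tables
            if table_fingerprint(source, t) != table_fingerprint(target, t)]
```

Running count and checksum queries per table keeps validation inside the database engines, which is what makes this kind of check far faster than row-by-row comparison scripts.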
2026
Open to opportunities in database administration, cloud infrastructure, and data engineering. Let's connect — I respond to all queries.