Data Engineer · Chennai, IN

Pipelines that ship revenue, not just reports.

I'm Abhinav — 4+ years architecting ETL/ELT on Snowflake, Azure, Palantir Foundry, and Airflow. Currently moving enterprise Martech datasets across the stack at AT&T, making every campaign measurably faster and more accurate.

+40% pipeline perf · +3.6% YoY conversion · 4+ yrs · AWS Cloud Practitioner
Sources: Palantir Foundry · SQL Server · Azure Blob (staging) · UniBasic (legacy) → Transform: Snowflake (warehouse) · Airflow DAGs (orchestration) · Infoworks + SQL (modeling) → Sink: AT&T Campaigns (abandon cart · iPhone launch · wireless & broadband)
Stage 01 · summary

A pipeline I've been running for 4 years.

task: about · state: SUCCESS
duration: 4.3yr · retries: 0

I treat pipelines like products — with SLAs, observability, and a changelog. Day-to-day I'm moving enterprise Martech datasets between Foundry, Azure, and Snowflake, and writing the Airflow DAGs that feed every downstream campaign.

What I care about: query plans that don't lie, schemas that age well, and 3am pages that never happen. Previously I hand-translated a 40-year-old UniBasic lease-accounting system into modern SQL — so I have patience for legacy, too.

+40% — pipeline performance lift from the Foundry → Snowflake migration at AT&T Martech.
+3.6% — YoY conversion lift on AT&T's Abandon-Cart campaign, with 100% record accuracy in flight.
Stage 02 · skills

Stack, mapped.

heatmap · proficiency × domain · rows=5 · cols=5
proficiency_matrix · SELECT domain, tool, level FROM skills · 5 × 5 · 25 cells

            warehouse   orchestration   modeling   ingestion    reporting
cloud       Snowflake   Airflow         Foundry    Azure Blob   AWS S3
sql         SnowSQL     procs/DAGs      DDL/DML    ETL SQL      reporting
tools       Snowsight   Infoworks       DBeaver    SSMS         Pentaho
languages   SQL         Python          UniBasic   Shell        —
ops         Git         Azure Repos     JIRA       Confluence   Tableau

proficiency — beginner → expert
Stage 03 · transformation

Where the work happens.

2 upstream tasks · state: RUNNING · HEAD=att_martech
1

Data Engineer & Snowflake Developer

Tech Mahindra — AT&T Martech · Telecom · Aug 2023 — Present
Palantir Foundry · SQL Server · Azure Blob · Snowflake · Airflow DAGs · Tableau · AWS S3
  • Architected and optimized ETL pipelines for AT&T's Abandon-Cart campaign — +3.6% YoY conversion lift, 100% record accuracy.
  • Led end-to-end migration of Martech datasets from Palantir Foundry → Snowflake (Parquet extracts, Azure Blob staging, SnowSQL ingest) — +40% pipeline perf.
  • Built and managed Apache Airflow DAGs to orchestrate ETL jobs; automated S3 up/downloads for vendor consumption.
  • Shipped data engineering for the iPhone Early Access launch; maintained Tableau dashboards for pipeline health.
  • Designed Infoworks pipelines for cross-platform replication; authored ER diagrams and solution docs at 100% client compliance.
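The Foundry → Snowflake hop above follows a common staged-ingest shape. A minimal SnowSQL sketch — stage, table, container, and token names are all hypothetical placeholders, not the production objects:

```sql
-- Hypothetical names throughout; the SAS token and container are placeholders.
-- External stage over the Azure Blob container holding the Foundry Parquet extracts.
CREATE OR REPLACE STAGE martech_blob_stage
  URL = 'azure://myaccount.blob.core.windows.net/martech-extracts'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
  FILE_FORMAT = (TYPE = PARQUET);

-- Load the extracts, mapping Parquet columns to table columns by name.
COPY INTO martech.campaign_events
  FROM @martech_blob_stage/abandon_cart/
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```

Matching by column name keeps the load resilient to column reordering in upstream extracts; schema drift then surfaces as a load error instead of silently shifted data.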
2

SQL Server Developer

Tech Mahindra — Bank of the West · BFSI · Sep 2021 — Jul 2023
UNIDATA/PICK · UniBasic · IDS Web App · Pentaho Reports
  • Migrated the InfoLease lease-accounting app from legacy UNIDATA/PICK to a modern IDS Web Application — reverse-engineering UniBasic into optimized SQL.
  • Designed & published business reports in Pentaho Report Designer; unit-tested with Beyond Compare to guarantee post-migration data integrity.
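Reverse-engineering UniBasic usually means collapsing record-at-a-time READ/WRITE loops into set-based SQL. A hedged T-SQL sketch with hypothetical table and column names (the real InfoLease schema isn't shown here):

```sql
-- Hypothetical schema. The UniBasic original would READ each lease record,
-- look up its rate, and WRITE an updated accrual one record at a time.
-- The set-based translation does the whole period in one statement:
UPDATE la
SET    la.accrued_amount = l.principal * r.monthly_rate,
       la.updated_at     = GETDATE()
FROM   lease_accrual AS la
JOIN   lease         AS l ON l.lease_id  = la.lease_id
JOIN   rate_table    AS r ON r.rate_code = l.rate_code
WHERE  la.period = '2023-06';
```

A row-for-row diff of old and new outputs (here, via Beyond Compare) is what confirms post-migration integrity.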
Stage 04 · schema

Foundational tables.

1 degree · 4 certifications · source-of-truth

B.E. Computer Science

SRM Institute of Science & Technology, Chennai · 2015 — 2019

Four years of DBMS, systems & algorithms — the primary keys on every query plan I read today.

certifications

current · verified
AWS Cloud Practitioner
RPA Developer — Foundation
Python Essential Training
SQL Essential Training