Senior Data Engineer

Job Description

TEO People is seeking a highly skilled Senior Data Engineer to join our dynamic team. We are looking for an innovative professional who can design, build, and optimize scalable data pipelines that power data-driven decision-making across our projects.

Job Title: Senior Data Engineer
Experience: 5+ years
Location: F-11, Islamabad (Onsite)


Responsibilities:
Design, develop, and optimize scalable ETL/ELT data pipelines using Azure Data Factory and Azure Databricks (PySpark).
Build and manage end-to-end data integration workflows across multiple data sources and target systems (e.g., SQL Server, Oracle, flat files).
Design and execute the migration of legacy data warehouse workflows to a modern Lakehouse platform.
Develop data mapping, transformations, and ingestion processes across enterprise data systems.
Ensure data quality, validation, governance, and reliability across all data pipelines.
Optimize PySpark and Spark SQL workloads for performance and cost efficiency.
Work with both batch and streaming data processing workloads.
Maintain and enhance Python/PySpark-based data frameworks, driving scalability and continuous improvement.
Automate deployments using CI/CD pipelines and Azure DevOps.
Collaborate with data architects, analysts, and business stakeholders to translate requirements into technical solutions.
Monitor, troubleshoot, and resolve production pipeline issues.

Requirements:
5+ years of hands-on experience in data engineering.
Strong expertise in Azure Databricks, PySpark, Spark SQL, and Azure Data Factory.
Experience working with Power BI reports.
Experience working with relational databases such as SQL Server and Oracle.
Strong proficiency in SQL (performance tuning, complex queries, CTEs, window functions).
Solid understanding of traditional Enterprise Data Warehouse (EDW) concepts and data modeling (star/snowflake schemas).
Experience with Delta Lake, Unity Catalog, and Databricks cluster management.
Experience building metadata-driven pipelines.
Knowledge of cloud architecture, distributed computing, and big data concepts.
Familiarity with data governance, cataloging, and security best practices.

Nice to Have:
Experience with Microsoft Fabric (OneLake, Lakehouse, Data Factory, Synapse integration).
Familiarity with containerization technologies such as Docker and Kubernetes.
Knowledge of workflow orchestration tools (Airflow, Prefect, Dagster).
Experience with real-time event streaming platforms (Kafka, Azure Event Hubs).
Experience with API-based data ingestion and REST integrations.

Share your resume at arooj.niazi@teo-intl.com or apply directly through LinkedIn. Please mention the position in the subject line.

Job Summary

  • Published on: 2026-01-28 1:30 pm
  • Vacancy: 1
  • Employment Status: Full Time
  • Experience: 5 Years
  • Job Location: Islamabad
  • Gender: No Preference
  • Application Deadline: 2026-03-14