Responsibilities:
- Designs, develops, and implements data models that give a big-picture view of the data needs of business use cases.
- Builds, tests, and validates data flows, and prepares ETL processes that incorporate business requirements into the design specifications.
- Solid grasp of core data warehousing concepts, specifically Kimball's dimensional modeling methodology.
- Experience creating and implementing data models and robust data architectures aligned with industry standards.
- Extensive ETL/ELT experience, preferably with Talend Studio and Microsoft Fabric.
- Expert knowledge of relational databases, including MySQL and PostgreSQL.
- Ability to write ad hoc SQL queries and optimize them, e.g., through indexing and query-plan analysis (see the sketch after this list).
- Experience with Python is required.
- Experience with Apache Airflow or another data orchestration tool is required (a minimal DAG sketch appears after this list).
- Knowledge of Big Data frameworks such as Databricks, the Hadoop ecosystem, and the Azure data engineering ecosystem.
- Experience with Power BI is a plus.
- Awareness of new and emerging data engineering tools in the market.
- Ability to work independently in time-sensitive environments, with good communication and interpersonal skills.
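
To illustrate the SQL optimization requirement above, here is a minimal sketch of inspecting an ad hoc query's execution plan on PostgreSQL from Python. It assumes psycopg2 is installed; the connection string and the table and column names (orders, customer_id, amount, order_date) are hypothetical placeholders.

```python
# Minimal sketch: run EXPLAIN ANALYZE on an ad hoc query to see the actual
# plan and timings, the usual starting point for indexing or rewrite decisions.
# The DSN and the orders table are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect("dbname=warehouse user=analyst")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute(
        "EXPLAIN ANALYZE "
        "SELECT customer_id, SUM(amount) "
        "FROM orders WHERE order_date >= %s "
        "GROUP BY customer_id",
        ("2024-01-01",),
    )
    # Each returned row is one line of the query plan.
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```

A sequential scan in the plan on a large, date-filtered table would suggest adding an index on order_date, then re-running EXPLAIN ANALYZE to confirm the improvement.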
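For the orchestration requirement, the sketch below shows a minimal daily extract-and-load DAG, assuming Apache Airflow 2.4+ (where the schedule argument replaced schedule_interval). The DAG id and the task callables are hypothetical placeholders, not a prescribed pipeline.

```python
# Minimal sketch of a daily two-step DAG in Apache Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull source data (e.g., from an API or source database).
    print("extracting source data")


def load():
    # Placeholder: write transformed data into the warehouse.
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must finish before load starts
```

The >> operator declares the task dependency, which is the core of what any orchestration tool in this role would manage: ordering, scheduling, and retries across ETL steps.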