Data Engineer

Job Description

Wego is looking for a Data Engineer to join our data team. We are a growing global travel tech company and as we scale, so does our data. Data is key to empowering our product, marketing, sales and operations teams to function effectively and efficiently. You will help our team to continue this empowerment through data management, engineering and analysis.

What you will do:

Master data management

Own and develop the core “source of truth” analytical databases for Wego. Continually improve data structures and analytical DB design as business needs and the product evolve. Surface as much information from our data as possible so the company can drive decisions with data.

Work with stakeholders upstream of the data team, such as product and infrastructure, to define data structures and schemas. Work with stakeholders downstream of the data team to understand analytical and business reporting needs. Hands-on data debugging, development, and deployment skills are required.

Manage and improve data pipelines

Working with open-source and cloud technology, you will help manage our ELT pipelines, reviewing and improving them to ensure reliability and scalability. Work with upstream engineers to adopt best practices, new solutions, and technologies that keep Wego’s data practice current with industry leaders.

Build new data pipelines for new product launches. Plan and write new scripts and tasks using SQL, Python, and/or Spark via Airflow/Jenkins to transform data into clean, well-structured datasets from which our analysts can derive insights.
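As a sketch of the kind of transformation task described above, here is a small Python function that turns raw event records into a clean, well-structured dataset. The record fields (`search_id`, `country`, `price`) are illustrative assumptions, not Wego's actual schema; in practice a function like this would run as an Airflow task or Jenkins job:

```python
# Illustrative sketch only: field names ("search_id", "country", "price")
# are assumptions, not Wego's actual schema. In a real pipeline this
# function would be wrapped in an Airflow task or Jenkins job.

def clean_searches(raw_records):
    """Transform raw search events into a well-structured analytical dataset.

    Drops records missing a search_id, normalizes country codes to upper
    case, and parses the price field into a float (None when unparseable).
    """
    cleaned = []
    for rec in raw_records:
        if not rec.get("search_id"):
            continue  # skip malformed events with no identifier
        try:
            price = float(rec.get("price", ""))
        except (TypeError, ValueError):
            price = None  # keep the row, flag the bad price for analysts
        cleaned.append({
            "search_id": rec["search_id"],
            "country": (rec.get("country") or "").strip().upper(),
            "price": price,
        })
    return cleaned

rows = clean_searches([
    {"search_id": "s1", "country": " sg ", "price": "120.50"},
    {"search_id": "", "country": "AE", "price": "99"},     # dropped: no id
    {"search_id": "s2", "country": "ae", "price": "n/a"},  # price -> None
])
```

The design choice here (drop rows with no key, but keep rows with a bad price as `None`) is one common convention; the right policy depends on the downstream reporting needs.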

Technical skills:

  • Experience maintaining scripts and processes that generate analytical datasets.
  • Advanced Python and SQL skills for manipulating and transforming data.
  • Ability to run and deploy tasks using tools such as Git, Jenkins, and Airflow.
  • Experience working with major big data technologies such as BigQuery, Redshift, and Hadoop/Spark.
  • Knowledge of cloud platforms such as Google Cloud Platform, Microsoft Azure, and Amazon Web Services (AWS).
  • Experience working with data visualization tools such as Tableau, Qlik, and Chartio.
  • Knowledge of other languages such as R, Ruby, or JavaScript would be a plus.
  • Knowledge of advanced statistical predictive models would be a plus.

Requirements:

  • Solid experience developing and working with relational and analytical data structures.
  • Comfortable working within large, complex databases and data warehouses. 
  • Sound knowledge of data processing concepts: transforming raw data into processed, ready-to-use analytical datasets.
  • Experience working on and optimizing data pipelines, ETLs or ELTs.

Job Summary

  • Published on: 2021-05-18, 4:44 pm
  • Vacancy: 1
  • Employment Status: Full Time
  • Experience: 2 Years
  • Job Location: Lahore
  • Gender: No Preference
  • Application Deadline: 2026-03-21