Job Description
Primary Responsibilities for a Big Data Engineer:
- Develop data pipelines using Apache NiFi and Spark
- Tune and optimize pipelines, queries, and Spark jobs
- Perform data profiling and analysis to uncover data quality issues
- Maintain and support ETL data operations (DataOps) and DevOps processes
Qualifications for a Big Data Engineer:
- Minimum of 2 years of experience in data lake and ELT development (Cloudera, Hortonworks)
- Bachelor's degree in Computer Science, Technology, or a related field of study, or equivalent experience
- Hands-on experience with Apache NiFi and/or Cloudera CFM is highly preferred
- Strong SQL development skills
- Ability to analyze business requirements and ask clarifying questions
- Experience creating test plans, executing tests, and resolving data discrepancies
- Ability to work independently with minimal supervision and bring a structured development approach to ongoing and new projects