Data Engineer with Hands on Airflow and Spark Programming

Job Details

Job Description

Technical/Functional Skills

  • IT experience in building data pipelines for data analytics programs.
  • Hands-on experience with Airflow and excellent coding skills in Spark programming (PySpark / ScalaSpark).
  • Hands-on experience in data pipeline projects required.

Experience Required

Significant experience in end-to-end SDLC implementation, including requirement mapping, finalization of specifications, development, testing, and implementation in Spark programming (PySpark / ScalaSpark).

Roles & Responsibilities

  • Responsible for gathering requirements from the customer.
  • Conducting workshops and creating blueprint documents.
  • Must be able to plan implementation activities, conversion, cutover, etc.
  • Must be able to configure the system per the client's requirements.
  • Good communication skills for interacting with clients, business users, IT staff, and offshore teams.
  • Excellent logical and analytical skills with problem-solving ability.
  • Good team player.

Key Skills