Azure Data Factory

Job Details

Job Description

Must-Have

  • Experience working on at least one Azure Data Factory project
  • Working experience in Azure Data Engineering
  • Primary skills – Python, R, Scala, Spark, PySpark – one or more of these required
  • Experience with the DataFrame API
  • Understanding of Machine Learning concepts
  • Working knowledge of Cloudera, Jupyter Notebook, or Databricks workspaces – at least one is required
  • Experience with ETL tools such as Informatica or BODS
  • Good knowledge of APIs and of ingesting data from SAP and non-SAP systems
  • Understanding of Delta Lake
  • Strong data warehousing (DW) concepts
  • Troubleshooting skills

Good-to-Have

  • DevOps experience
  • Operations support experience
  • Good SQL knowledge
  • Good communication skills
  • Team player

Responsibility of / Expectations from the Role

  • Monitor and manage software components deployed in Azure Data Factory, such as data integrations and data transformations
  • Build simple to complex pipelines and data flows
  • Understand business requirements and actively provide inputs from a data perspective
  • Understand the underlying data and how it flows through the system
  • Work with other Azure stack modules like Azure Data Lakes, SQL DW, etc.
  • Implement modules that include security and authorization frameworks
  • Recognize and adapt to the changes in processes as the project evolves in size and function

Key Skills