Summary: The Senior Azure Data Factory Engineer is responsible for designing and implementing scalable data ingestion pipelines from various sources, ensuring data quality and compliance. The role involves automating workflows, optimizing data models, and maintaining cloud data solutions using Azure technologies. The engineer will also oversee data governance and security while troubleshooting performance issues. Strong experience in data processing, automation, and cloud platforms is essential for success in this position.
Key Responsibilities:
- Design and implement scalable data ingestion pipelines from diverse sources including APIs, SharePoint, on-premise systems, and file-based sources.
- Perform data cleansing, validation, and transformation to produce high-quality, reliable datasets.
- Develop and maintain data migration and archival strategies ensuring accuracy, integrity, and compliance.
- Automate ingestion and processing workflows using Python, PowerShell, and orchestration tools.
- Build and maintain solutions using Azure Data Factory, Blob Storage, ADLS, and Azure SQL.
- Oversee data quality frameworks ensuring accuracy, consistency, and integrity.
- Optimize pipelines, SQL queries, and storage layers while troubleshooting performance issues.
- Implement audit logging, data lineage, and compliance practices.
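As a purely illustrative sketch of the cleansing-and-validation work the responsibilities above describe (the schema, field names, and rules here are assumptions, not from the posting), a minimal Python example might quarantine bad rows rather than silently drop them:

```python
import csv
import io
from datetime import datetime

REQUIRED = ("id", "event_date", "amount")  # hypothetical schema

def clean_rows(raw_csv: str):
    """Trim whitespace, reject rows missing required fields,
    and normalise event_date to ISO 8601 (illustrative rules only)."""
    valid, rejected = [], []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        row = {k: (v or "").strip() for k, v in row.items()}
        try:
            if any(not row[f] for f in REQUIRED):
                raise ValueError("missing required field")
            # accept DD/MM/YYYY input, emit ISO 8601
            row["event_date"] = datetime.strptime(
                row["event_date"], "%d/%m/%Y").date().isoformat()
            row["amount"] = str(float(row["amount"]))
            valid.append(row)
        except ValueError:
            rejected.append(row)  # quarantined for audit, not silently dropped
    return valid, rejected

raw = "id,event_date,amount\n1, 03/02/2024 ,10.5\n2,,3\n3,bad-date,1\n"
ok, bad = clean_rows(raw)
```

Keeping a rejected-row set alongside the valid output is one common way to satisfy the accuracy and audit requirements the posting mentions.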
Key Skills:
- Strong experience in ingestion, migration, archival, pipeline automation, and large-scale data processing.
- Advanced SQL expertise including stored procedures, indexing, and performance tuning.
- Hands-on experience with Python and PowerShell for ETL/ELT and automation.
- Experience with Azure Data Factory, ADLS, Blob Storage, and Azure SQL.
- Expertise in data modeling (logical/physical), quality frameworks, and optimization.
- Ability to work with structured, semi-structured, and unstructured data formats.
- Strong knowledge of audit logging, lineage, cataloguing, metadata management, and security.
- Demonstrated automation mindset including use of AI agents.
- Hands-on experience with, or a strong understanding of, modern cloud data warehousing, including Snowflake fundamentals.
Salary (Rate): undetermined
City: London Area
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Role: Senior Azure Data Factory Engineer
Data Engineering & Processing:
- Design and implement scalable data ingestion pipelines from diverse sources including APIs, SharePoint, on-premise systems, and file-based sources.
- Perform data cleansing, validation, and transformation to produce high-quality, reliable datasets.
- Develop and maintain data migration and archival strategies ensuring accuracy, integrity, and compliance.
- Build and optimise logical and physical data models.
- Handle diverse data structures and formats including BAK, MDF, CSV, JSON, XML, and Parquet.
Automation & Orchestration:
- Automate ingestion and processing workflows using Python, PowerShell, and orchestration tools.
- Apply an automation-first mindset, including experience integrating AI agents for workflow automation.
Cloud Data Platforms (Azure):
- Build and maintain solutions using Azure Data Factory, Blob Storage, ADLS, and Azure SQL.
- Implement data lineage, cataloguing, and governance capabilities.
Data Quality, Security & Compliance:
- Oversee data quality frameworks ensuring accuracy, consistency, and integrity.
- Implement audit logging, data lineage, and compliance practices.
- Maintain strong security and governance controls.
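To make the audit-logging and lineage duties above concrete, one possible shape is a structured JSON record per pipeline run; a minimal sketch (the field names and content-hash approach are assumptions for illustration, not from the posting):

```python
import json
import hashlib
from datetime import datetime, timezone

def audit_record(pipeline: str, source: str, row_count: int, payload: bytes) -> str:
    """Build one JSON audit-log line for a pipeline run: what ran, when,
    how many rows, and a content hash so downstream lineage checks can
    detect drift. Field names here are illustrative, not a standard."""
    return json.dumps({
        "pipeline": pipeline,
        "source": source,
        "rows": row_count,
        "sha256": hashlib.sha256(payload).hexdigest(),
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }, sort_keys=True)

line = audit_record("daily_sales", "sales_export.csv", 42, b"col1,col2\n1,2\n")
```

Emitting one self-describing line per run keeps the log append-only and easy to ship to whatever cataloguing or monitoring store the platform uses.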
Performance & Scalability:
- Optimize pipelines, SQL queries, and storage layers.
- Troubleshoot performance issues across compute and storage.
Must-Have Skills (Data-Focused Essentials):
- Strong experience in ingestion, migration, archival, pipeline automation, and large-scale data processing.
- Advanced SQL expertise including stored procedures, indexing, and performance tuning.
- Hands-on experience with Python and PowerShell for ETL/ELT and automation.
- Experience with Azure Data Factory, ADLS, Blob Storage, and Azure SQL.
- Expertise in data modelling (logical/physical), quality frameworks, and optimisation.
- Ability to work with structured, semi-structured, and unstructured data formats.
- Strong knowledge of audit logging, lineage, cataloguing, metadata management, and security.
- Demonstrated automation mindset, including use of AI agents.
- Hands-on experience with, or a strong understanding of, modern cloud data warehousing, including Snowflake fundamentals such as virtual warehouses, micro-partitioning, query optimization, and role-based access control.
Good-to-Have Skills (Infrastructure & Advanced Platforms):
Infrastructure & DevOps:
- Experience with Terraform, Azure DevOps, YAML pipelines, and cloud automation.
- Exposure to Azure Function Apps, serverless compute, and orchestration.
- Understanding of infrastructure as code and cloud deployment patterns.
- Exposure to Docker and Kubernetes.
Advanced Data Platforms & Tools:
- Deep or hands-on exposure to Snowflake, including:
  - Creating and managing Snowflake objects (databases, schemas, roles).
  - Using Snowpipe for automated ingestion.
  - Performance tuning using clustering, caching, and micro-partitioning.
  - Understanding Snowflake cost optimization and storage/compute separation.
- Experience with Databricks for advanced data engineering workflows.
- Familiarity with Denodo or Microsoft Purview.
Certifications: Azure or Snowflake certifications are strong advantages.