Job Description: The Senior Data Engineer will develop complex data sources and pipelines for a data platform, primarily using Snowflake and other applications like Azure and Airflow. This role involves collaboration with various teams to align requirements and identify technological gaps, while also ensuring data integrity and reliability through monitoring and automation. The position is contract-based for an initial duration of six months, with a hybrid work structure requiring two days a week in London.
Key Responsibilities:
- Integrate data from multiple on-prem and cloud sources and systems.
- Handle data ingestion, transformation, and consolidation for analysis and reporting.
- Develop data transformation routines to clean, normalize, and aggregate data.
- Apply data processing techniques for complex data structures and prepare data for analysis.
- Implement data de-identification/data masking in line with company standards.
- Monitor data pipelines and systems to detect and resolve issues promptly.
- Develop monitoring tools to automate error handling mechanisms.
- Utilize data quality tools to ensure data accuracy and integrity.
- Create & maintain data pipelines using Airflow & Snowflake.
- Create SQL Stored procedures for complex transformations.
- Understand data requirements and design optimal pipelines.
- Create logical & physical data models to ensure data integrity.
- CI/CD pipeline creation & automation using Git & Git Actions.
- Tune and optimize data processes.
Skills Required:
- Bachelor's degree in Computer Science or a related field.
- Proven hands-on experience as a Data Engineer.
- Proficiency in SQL with experience using Window functions and advanced features.
- Excellent communication skills.
- Strong knowledge of Python.
- Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, etc.
- In-depth knowledge of Snowflake architecture and best practices.
- Experience with CI/CD pipelines using Git and Git Actions.
- Knowledge of various data modeling techniques.
- Hands-on experience with developing data pipelines and writing complex SQL queries.
- Experience with Kubernetes and Linux containers.
- Experience with both relational and non-relational databases.
- Analytical and problem-solving skills applied to big data datasets.
- Experience working on projects with agile/scrum methodologies.
- Good understanding of access control and data masking.
- Exposure to DevOps methodology.
- Knowledge of data warehousing principles and architecture.
Salary (Rate): undetermined
City: London
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
As a Senior Data Engineer, you will be responsible for developing complex data sources and pipelines into our data platform (Snowflake), along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with the Data, Architecture, Business Analyst, and Data Steward teams to integrate and align the requirements, specifications, and constraints of each element of the work. They will also help identify gaps in the resources, technology, or capabilities required, and work with the data engineering team to identify and implement solutions where appropriate.
Work type: Contract
Length: initial 6 months
Work structure: hybrid, 2 days a week in London.
Primary Responsibilities:
- Integrate data from multiple on-prem and cloud sources and systems.
- Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
- Develop data transformation routines to clean, normalize, and aggregate data.
- Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
- Implement data de-identification/data masking in line with company standards.
- Monitor data pipelines and data systems to detect and resolve issues promptly.
- Develop monitoring tools to automate error handling mechanisms to ensure data integrity and system reliability.
- Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle.
- Create & maintain data pipelines using Airflow & Snowflake as the primary tools.
- Create SQL stored procedures to perform complex transformations.
- Understand data requirements and design optimal pipelines to fulfil the use cases.
- Create logical & physical data models to ensure data integrity is maintained.
- CI/CD pipeline creation & automation using Git & Git Actions.
- Tune and optimize data processes.
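As an illustration of the transformation routines described above (clean, normalize, aggregate), here is a minimal Python sketch; the field names and cleaning rules are hypothetical examples, not taken from the role:

```python
from collections import defaultdict


def clean_normalize_aggregate(rows):
    """Clean raw records, normalize a key field, and aggregate totals.

    `rows` is a list of dicts with hypothetical keys 'region' and 'amount';
    records with missing or non-numeric amounts are dropped.
    """
    totals = defaultdict(float)
    for row in rows:
        region = row.get("region")
        amount = row.get("amount")
        if region is None or amount is None:
            continue  # drop incomplete records
        try:
            value = float(amount)
        except (TypeError, ValueError):
            continue  # drop non-numeric amounts
        totals[region.strip().lower()] += value  # normalize region names
    return dict(totals)


raw = [
    {"region": " London ", "amount": "10.5"},
    {"region": "london", "amount": 4.5},
    {"region": "Leeds", "amount": None},  # dropped: missing amount
    {"region": None, "amount": 3},        # dropped: missing region
]
print(clean_normalize_aggregate(raw))  # {'london': 15.0}
```

In a real pipeline this logic would typically live in an Airflow task or a Snowflake transformation step rather than a standalone script.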
Required Qualifications:
- Bachelor's degree in Computer Science or a related field.
- Proven hands-on experience as a Data Engineer.
- Proficiency in SQL (any flavor), with experience using Window functions and advanced features.
- Excellent communication skills.
- Strong knowledge of Python.
- Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc.
- In-depth knowledge of Snowflake architecture, features, and best practices.
- Experience with CI/CD pipelines using Git and Git Actions.
- Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault.
- Hands-on experience with:
  - Developing data pipelines (Snowflake) and writing complex SQL queries.
  - Building ETL/ELT data pipelines.
  - Kubernetes and Linux containers (e.g., Docker).
  - Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux).
- Experience with both relational (RDBMS) and non-relational databases.
- Analytical and problem-solving skills applied to big data datasets.
- Experience working on projects with agile/scrum methodologies and high-performing teams.
- Good understanding of access control, data masking, and row access policies.
- Exposure to DevOps methodology.
- Knowledge of data warehousing principles, architecture, and implementation.
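To illustrate the data masking and de-identification requirement above, here is a minimal Python sketch using a salted hash as a deterministic pseudonym. The salt handling and field names are hypothetical; a production implementation would follow the company's masking standards (e.g. Snowflake masking and row access policies):

```python
import hashlib


def mask_value(value: str, salt: str) -> str:
    """Replace a sensitive value with a deterministic pseudonym.

    The same input always maps to the same token, so joins across tables
    still work after masking, while the original value is not stored.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]  # shortened token; the length is an arbitrary choice


def mask_record(record: dict, sensitive_fields: set, salt: str) -> dict:
    """Return a copy of `record` with the sensitive fields masked."""
    return {
        key: mask_value(str(val), salt) if key in sensitive_fields else val
        for key, val in record.items()
    }


row = {"customer_id": "C123", "email": "jane@example.com", "spend": 42.0}
masked = mask_record(row, {"email"}, salt="demo-salt")
```

Determinism is the key design choice here: hashing with a shared salt keeps referential integrity across masked tables, whereas random tokens would break downstream joins.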
