Job Description: The Data Architect role involves designing, developing, and maintaining data pipelines and ETL processes using Snowflake on AWS. The position requires collaboration with various stakeholders to optimize data warehousing solutions and ensure data quality and integrity. Candidates should have extensive experience in data engineering, particularly with Snowflake and AWS technologies. The role emphasizes performance optimization and staying current with industry trends in data engineering and cloud technologies.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS (an illustrative sketch follows this list).
- Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
- Collaborate with data analysts, data scientists, and other stakeholders to define and fulfill data requirements.
- Optimize the performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability.
- Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
- Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
- Stay up to date with the latest trends and best practices in data engineering and cloud technologies.
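To make the first responsibility concrete, below is a minimal, purely illustrative sketch of one typical pipeline step: loading data staged in S3 into Snowflake with the snowflake-connector-python package. Every name in it (the connection settings, the ORDERS_STAGE external stage, and the RAW.ORDERS table) is a hypothetical placeholder rather than a detail of this role.

    import os
    import snowflake.connector

    def load_orders_from_s3() -> None:
        # Hypothetical example: credentials come from the environment,
        # never hard-coded.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            cur = conn.cursor()
            # @ORDERS_STAGE is assumed to be an external stage pointing at
            # an S3 bucket; COPY INTO is Snowflake's standard bulk-load path.
            cur.execute(
                "COPY INTO RAW.ORDERS "
                "FROM @ORDERS_STAGE/orders/ "
                "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1) "
                "ON_ERROR = 'ABORT_STATEMENT'"
            )
            # COPY INTO returns one row of load results per staged file.
            for row in cur.fetchall():
                print(row)
        finally:
            conn.close()

    if __name__ == "__main__":
        load_orders_from_s3()

A production pipeline would wrap a step like this with retries, logging, and data-quality checks, in line with the monitoring and troubleshooting responsibility above.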
Skills Required:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
- Proficiency in SQL, Python, and ETL tools (e.g., StreamSets, dbt).
- Hands-on experience with Oracle RDBMS.
- Experience migrating data to Snowflake.
- Experience with AWS services such as S3, Lambda, Redshift, and Glue.
- Strong understanding of data warehousing concepts and data modeling.
- Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
- Understanding of, or hands-on experience with, orchestration solutions such as Apache Airflow (see the sketch after this list).
- Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
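As a sketch of the orchestration point above, the following minimal Airflow 2.x DAG schedules the hypothetical Snowflake load daily; the DAG ID, task ID, and callable are placeholders, assuming the load function from the earlier sketch.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_orders_from_s3() -> None:
        # Placeholder for the COPY INTO load step sketched earlier.
        ...

    with DAG(
        dag_id="snowflake_daily_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_orders",
            python_callable=load_orders_from_s3,
        )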
Salary (Rate): Negotiable
City: Basildon
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
