Data Engineer

HL11882

Up to £70k per annum + bonus

Central London (Hybrid, 2 days a week)

Permanent

Finatal is exclusively partnered with an innovative business undergoing a major data transformation to build a high-performing Data function. As part of this transformation, they are now seeking a talented Data Engineer to model and build scalable data pipelines and to own the dbt tool and its transformations. You will join a forward-thinking team at the beginning of its journey, working with cutting-edge technologies and frameworks, shaping data processes, collaborating on the implementation of DataOps methodologies, and leveraging AI tools to enhance efficiency.

Role:

  • Reporting directly to the Data Architect (Data Platform Manager), the Data Engineer will ensure that data is clean, reliable, and delivered according to best practices.
  • Design, develop, and maintain scalable data pipelines from diverse sources.
  • Use dbt (data build tool) to transform data through the modelling layers, ensuring data quality and integrity.
  • Write custom connectors using Python and leverage out-of-the-box data-loading tools.
  • Embed real-time, automated data quality checks, validations, and alerts across data pipelines.
  • Manage CI/CD workflows with GitHub, GitHub Actions, and Kubernetes.
  • Implement and review Role-Based Access Control (RBAC) policies.
  • Champion best practices in data engineering and DataOps methodologies.
  • Identify and recommend improvements to existing processes and systems.
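To illustrate the automated data-quality checks mentioned above, here is a minimal plain-Python sketch of the kind of validation that might run inside a pipeline before data reaches a curated layer. The schema (order_id, amount) and the rules are illustrative assumptions only, not part of the role description.

```python
# Minimal sketch of an automated data-quality check. In practice this logic
# would typically live in dbt tests or a pipeline step; the field names and
# rules here are hypothetical.

def validate_rows(rows):
    """Return a list of (row_index, message) violations for a batch of records."""
    violations = []
    for i, row in enumerate(rows):
        # Rule 1 (assumed): primary key must be present.
        if row.get("order_id") is None:
            violations.append((i, "order_id is null"))
        # Rule 2 (assumed): amount must be present and non-negative.
        amount = row.get("amount")
        if amount is None or amount < 0:
            violations.append((i, "amount missing or negative"))
    return violations

batch = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.00},
    {"order_id": 3, "amount": -2.50},
]

print(validate_rows(batch))  # flags rows 1 and 2
```

In a real pipeline, violations like these would feed the alerting described above rather than simply being printed.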

Requirements:

  • Advanced expertise in SQL across multiple platforms (e.g., Snowflake, MSSQL, Databricks)
  • Exceptional technical ability, with strong proficiency in dbt and Python libraries such as PySpark, Pandas, and Snowpark
  • Experience with cloud-based data warehousing, particularly Snowflake, and ELT tools such as Fivetran
  • Strong knowledge of API integration and JSON transformation
  • Experience with event streaming (e.g., Azure Service Bus, Kafka, RabbitMQ)
  • Familiarity with orchestration platforms (e.g., Apache Airflow, Dagster, Prefect)
  • Proficient with CI/CD principles and Git-based workflows
  • Experience with cloud services, particularly Azure, and containerisation tools like Kubernetes (nice to have)
  • Understanding of data modeling approaches such as 3NF, Star, Data Vault, etc.
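As a small illustration of the API integration and JSON transformation skills listed above, the sketch below flattens a nested JSON payload into tabular rows ready for warehouse loading. The payload shape and field names are assumptions for demonstration, not taken from the role.

```python
import json

# Hypothetical nested payload, as an API might return it.
PAYLOAD = """
{
  "customer": {"id": 42, "name": "Acme Ltd"},
  "orders": [
    {"id": "A1", "total": 120.0},
    {"id": "A2", "total": 75.5}
  ]
}
"""

def flatten(payload: str):
    """Denormalise the parent customer onto each order row."""
    doc = json.loads(payload)
    customer = doc["customer"]
    return [
        {
            "customer_id": customer["id"],
            "customer_name": customer["name"],
            "order_id": order["id"],
            "order_total": order["total"],
        }
        for order in doc["orders"]
    ]

rows = flatten(PAYLOAD)
print(rows[0]["order_id"])  # A1
```

In a real connector, the payload would come from an HTTP response and the flattened rows would be staged for loading into the warehouse.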

If this opportunity is of interest, or you know someone who would be, please send your CV and contact details to harriet.lander@finatal.com.