Job Description
Data Engineer - Self Service & Analytics Platform
Global Telecommunications Company
£55K - £65K basic
Hybrid - one day a week in the London office
We are working on behalf of a global telecommunications business who are looking for a Data Engineer to join their Data Democratisation team.
Our client runs the UK’s fastest broadband network, owns the nation’s best-loved mobile brand, and is one of the UK’s biggest companies.
As a Data Engineer, you will:
- Leverage innovative GCP technologies such as Cloud Spanner to develop a unified identification service that connects millions of customers (a minimal code sketch follows this list).
- Create and maintain unified identification data models for analytics, enhancing insights across the entire customer base to improve the customer experience.
- Support the integration of the identification service with third-party SaaS and internal systems, ensuring all processes consistently handle IDs.
- Provide ongoing support for current and future identification services, ensuring compliance with service level agreements (SLAs).
- Troubleshoot and resolve issues related to identification services, delivering timely and effective solutions to maintain system stability.
- Collaborate with stakeholders from various departments to gather requirements and ensure successful solution delivery.
- Work alongside skilled Architects, DevOps Engineers, Data Engineers, Analytics Engineers, and Analysts to meet team objectives and key results.
- Utilise Infrastructure as Code (IaC) principles to deploy cloud infrastructure efficiently with tools like Terraform.
- Stay informed about industry trends, technologies, and best practices, continuously improving your skills to enhance the organisation’s capabilities.
- Actively participate in code reviews, fostering a culture of high-quality code and knowledge sharing within the team.
- Identify areas for improvement, propose innovative ideas, and contribute to the team’s growth and success.
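To give a flavour of the identification work described above, here is a minimal sketch of looking up a unified customer ID in Cloud Spanner with the Python client. The instance, database, table, and column names are illustrative assumptions, not the client's actual schema.

```python
# Minimal sketch: query a unified-ID table in Cloud Spanner.
# All identifiers below are hypothetical, for illustration only.
from google.cloud import spanner

client = spanner.Client()
instance = client.instance("identity-instance")  # hypothetical instance ID
database = instance.database("identity-db")      # hypothetical database ID

with database.snapshot() as snapshot:
    rows = snapshot.execute_sql(
        "SELECT customer_id, unified_id "
        "FROM customer_identity WHERE customer_id = @cid",
        params={"cid": "c-12345"},
        param_types={"cid": spanner.param_types.STRING},
    )
    for customer_id, unified_id in rows:
        print(customer_id, unified_id)
```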
We’d love to hear from you if you:
- Are focused on delivering business value.
- Have a proactive “I can solve it” attitude.
- Are a collaborative team player with excellent communication skills.
- Possess strong proficiency in Python with proven Data Engineering experience.
- Have solid SQL skills, including analytical (window) functions, and care about data quality (see the BigQuery sketch after this list).
- Understand DevOps best practices, including IaC and CI/CD.
- Have production experience in a cloud environment, preferably GCP.
- Have worked with data warehouse engines like BigQuery or Redshift.
- Have experience with key/value stores such as Datastore or DynamoDB.
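As an example of the SQL analytical functions mentioned above, here is a small sketch using the BigQuery Python client to keep only the latest record per customer with a window function. The project, dataset, and table names are made up for the example.

```python
# Minimal sketch: deduplicate customer records with ROW_NUMBER() in BigQuery.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT customer_id, email, updated_at
    FROM (
        SELECT
            customer_id,
            email,
            updated_at,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY updated_at DESC
            ) AS rn
        FROM `my-project.analytics.customer_events`  -- hypothetical table
    )
    WHERE rn = 1  -- keep only the latest record per customer
"""
for row in client.query(query).result():
    print(row.customer_id, row.email)
```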
It would be great if you:
- Have experience working in an Agile environment.
- Are familiar with SOLID principles and have an interest in their application.
- Have practiced Test-Driven Development (TDD).
- Have used Apache Airflow to build and maintain production pipelines (a minimal DAG sketch follows this list).
- Have experience with Apache Beam or Apache Spark.
- Have worked with Kubernetes.
- Have experience using Terraform.
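For the Airflow point above, here is a minimal DAG sketch in Python. The DAG ID, schedule, and task body are placeholders rather than the client's real pipelines.

```python
# Minimal sketch of an Airflow 2.x DAG with a single Python task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ids():
    # Placeholder task body; a real pipeline would pull IDs from a source system.
    print("extracting customer IDs")


with DAG(
    dag_id="customer_id_sync",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ids", python_callable=extract_ids)
```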
Technologies Our Client Uses:
- Apache Airflow on Cloud Composer
- Apache Beam on Cloud Dataflow (sketched below)
- dbt (Data Build Tool)
- Cloud Functions, Firestore in Datastore mode, Pub/Sub, BigQuery, Cloud Storage, Terraform, GitLab, and other relevant technologies.
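To give a flavour of the stack listed above, here is a minimal Apache Beam pipeline in Python that could run locally or on Cloud Dataflow with the appropriate runner options. The bucket paths are hypothetical.

```python
# Minimal sketch: read IDs from Cloud Storage, dedupe them, write them back.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # add runner/project flags to target Dataflow

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/ids.csv")  # hypothetical path
        | "Parse" >> beam.Map(lambda line: line.split(",")[0])
        | "Distinct" >> beam.Distinct()
        | "Write" >> beam.io.WriteToText("gs://example-bucket/unified_ids")
    )
```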