
Data Engineer

Premium Job From City Football
Recruiter: City Football
Listed on: 14th December 2020
Location: Manchester
Salary Notes: Competitive + Benefits
Type: Permanent
Start Date: ASAP


Purpose:

We have an exciting opportunity to join City Football Group as a Data Engineer in Manchester on a full-time permanent basis. 

Data engineers implement methods to improve data reliability and quality. They combine raw information from different sources to create consistent, machine-readable formats, and they do this by developing, maintaining, and testing infrastructure for data generation, data acquisition, and systems integration. Data engineers work closely with data scientists and are responsible for architecting solutions that enable data extraction and transformation for predictive or prescriptive modelling.

We are looking for an experienced Data Engineer to join our growing team of experts. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, and for optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, architects, analysts, and data scientists, and will ensure our data delivery architecture remains consistent and optimal across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
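
For illustration only (not part of the role specification): a minimal sketch of the kind of extract-transform-load step described above, written in Python with pandas. The file paths, column names and schema are hypothetical assumptions, not details taken from this posting.

    # Minimal, illustrative ETL sketch: combine raw data from two hypothetical
    # sources into one consistent, machine-readable (Parquet) dataset.
    # File paths, column names and schema are assumptions for illustration only.
    import pandas as pd

    def extract() -> pd.DataFrame:
        # Raw inputs from two different systems, each with its own naming.
        matches = pd.read_csv("raw/matches.csv")      # e.g. MatchID, KickOff
        tracking = pd.read_json("raw/tracking.json")  # e.g. match_id, player_id, distance_m
        matches = matches.rename(columns={"MatchID": "match_id", "KickOff": "kick_off"})
        return matches.merge(tracking, on="match_id", how="inner")

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Normalise types into one consistent schema.
        df["kick_off"] = pd.to_datetime(df["kick_off"], utc=True)
        return df[["match_id", "kick_off", "player_id", "distance_m"]]

    def load(df: pd.DataFrame) -> None:
        # Columnar output ready for analysts and data scientists downstream.
        df.to_parquet("curated/match_tracking.parquet", index=False)

    if __name__ == "__main__":
        load(transform(extract()))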

Accountabilities with key outcomes:

  • Build data systems and pipelines and perform systems integration (see the workflow sketch after this list)
  • Conduct complex data analysis and report on results
  • Build algorithms and prototypes
  • Prepare data for prescriptive and predictive modelling
  • Develop analytical tools and programs
  • Apply DevOps practices such as CI/CD, infrastructure and workflow automation
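
For illustration only: a minimal sketch of the kind of workflow referenced in the list above, written as an Apache Airflow DAG in Python. The DAG id, schedule, task names and task bodies are hypothetical placeholders, not details taken from this posting.

    # Minimal Apache Airflow DAG sketch for a daily ingest-and-transform workflow.
    # All identifiers below are hypothetical; the task bodies are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest():
        print("pull raw files from source systems")       # placeholder step

    def transform():
        print("normalise raw files into curated tables")  # placeholder step

    with DAG(
        dag_id="daily_match_data",
        start_date=datetime(2020, 12, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        ingest_task >> transform_task    # ingest must finish before transform starts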

Knowledge, Skills, and Experience:

Essential

  • Experience with big data tools: Hadoop, Spark, Kafka, Databricks etc.
  • Experience with relational SQL and NoSQL databases, including Postgres, Azure SQL, Cosmos DB etc.
  • Experience with Cloud Computing architecture and services
  • Experience with data pipeline and workflow management tools such as Azure Data Factory, AWS Data Pipeline, Apache Airflow, etc.
  • Experience with Azure, AWS or GCP cloud services or equivalent.
  • Experience with stream-processing systems such as Spark Streaming (see the sketch after this list)
  • Experience with Python
  • Demonstrable experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing, and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • History of working effectively in a small team environment; a strong team player with the ability to troubleshoot complex problems.
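
For illustration only: a minimal sketch of a stream-processing job of the kind listed above, using Spark Structured Streaming in Python to read from Kafka and land the data as Parquet. The broker address, topic, schema and paths are hypothetical, and the sketch assumes the spark-sql-kafka connector package is available on the cluster.

    # Minimal Spark Structured Streaming sketch: Kafka in, Parquet out.
    # Broker, topic, schema and output paths are hypothetical assumptions.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    schema = StructType([
        StructField("event_id", StringType()),
        StructField("match_id", StringType()),
        StructField("metric", DoubleType()),
    ])

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "match-events")               # hypothetical topic
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    query = (
        events.writeStream
        .format("parquet")
        .option("path", "curated/events")            # hypothetical output path
        .option("checkpointLocation", "chk/events")  # required for streaming sinks
        .outputMode("append")
        .start()
    )
    query.awaitTermination()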

Desirable

  • Microsoft Certified Azure Data Engineer, Google Cloud Certified Professional Data Engineer, Amazon Web Services (AWS) Certified Big Data Specialty, or equivalent
  • Databricks Certified Developer for Apache Spark
  • Databricks Certified Associate ML Practitioner
  • Computer science degree or similar
  • Experience with Java or C
  • DevOps experience

Job impact/influence measures:

Delivery of activities related to a specific area of work within CFG, including technical/para-professional support and operational delivery.

Operational/analytical roles, with specific knowledge/expertise primarily developed through specialised training.

Decision-making Authority:

The focus will be on managing processes and activities against agreed plans, co-ordinating activities with colleagues. Up to quarterly planning horizons.

Accountable for the delivery of specific, time-bound outcomes, working towards agreed/recognised/standardised outputs.