Azure Data Engineer - London


Premium Job From BCT Resourcing

Recruiter

BCT Resourcing

Listed on

27th April 2022

Location

London

Salary/Rate

£500 - £600 per day

Type

Contract

Start Date

26th April 2022


Azure Data Engineer - Outside IR35 Contract
Location: London - Hybrid (3 days p/week in London)
Daily Rate: £500 - £600 p/day

I am currently working with a London-based, up-and-coming insurance organisation that is looking for a talented Azure Data Engineer to join their Insight team.

The Data Engineer will work within the technology Engineering team to design and develop data solutions in an Azure cloud-native environment. The landscape is greenfield: a new Azure and Databricks-based platform will be designed and implemented from the ground up, so strong skills in Azure tooling (ADF), SQL, and Python are a key requirement.

Key Responsibilities:

* Experience in Azure platform architecture and developing data engineering pipelines.
* Experience in SQL and Python (Spark).
* Provide technical assistance in data engineering, analysis, orchestration and enrichment.
* Build relationships within the business, understand our decision-making processes and identify opportunities to exploit data to enhance decision making.
* Build data solutions/applications to solve complex business problems.
* Deploy data pipelines and applications in a production-safe manner, using DevOps best practices to ensure scalability and reusability.

Experience Required:

* Experience of working as a Data Engineer or in a similar role.
* Proven track record in Data Engineering and supporting the business to gain true insight from data.
* Strong SQL, Python, and Azure data platform experience (ADF, Azure Synapse).
* Experience with data tools and languages in a cloud native environment.
* Demonstrable ability to design, implement and use different database structures, with particular focus on cloud-based data services such as Azure, Databricks and Snowflake.
* Experience in building ETL/ELT data pipelines and use of DevOps (CI/CD) concepts to test, schedule, and deploy to a production environment.
* Desirable experience in parallel computing to process large datasets and to optimise computationally intensive tasks.
* Desirable: experience using MI reporting tools and visualisation packages such as Power BI, Tableau or Plotly Dash to make use of the datasets created during the engineering process.

The business has a flat structure and operates a flexible working model. Employees' well-being is a top priority for the business.

