
AWS Data Engineer

Recruiter: Nigel Frank International
Listed on: 30th November 2021
Location: Edinburgh
Salary/Rate: £550 - £630
Type: Contract
Start Date: January

This job has now expired; please search on the home page to find live IT jobs.

I am working with a consultancy based in Edinburgh that is a specialist provider of Data, Business Intelligence and Analytics learning solutions and experienced resources. Their vision is to support sound decision-making by empowering people to drive value through data and analytics. Most of their core work is with financial services clients, and this role is no different.

Role - AWS Data Engineer

Location - Fully Remote

Rate - £500-£630 (depending on experience)

Inside IR35

Start Date - January

The Role

Join my client as a Data Engineer to design and develop data warehouse solutions on the AWS Cloud, using AWS tools (CLI, Glue, etc.) and big data tools (PySpark, Presto, Airflow, etc.) to build Credit Risk reporting capabilities currently delivered in OBIEE, Oracle, Informatica and SAS.

  • Support strategic cloud adoption decisions through pilots, improving continuous delivery, infrastructure as code and serverless architecture.
  • Design, develop, test, document, and optimise data integration and data quality processes for the data warehouse, operational data stores, and other systems.
  • Ensure adherence to the end client's standards, design principles, detailed specifications and documentation; produce designs and work with supplier partner resources to build and implement the cloud-native solution in compliance with technology standards and requirements.
  • Analyse requirements and perform impact analysis; contribute to the creation of the high-level design and ensure it is represented, reviewed and approved in the relevant technology forums.
  • Create and review application/component designs; build and unit test code; provide accurate status on deliverables, and identify and communicate risks and issues; carry out peer reviews to ensure quality goals are met for project deliverables.
  • Support implementation activities; enable technical knowledge sharing across the team; work with suppliers and vendors on designated areas.
  • Contribute to simplifying existing processes to maximise output; build a good level of knowledge of the wholesale credit risk applications and toolsets; seek opportunities to develop utilities and tools that improve productivity and efficiency; share knowledge gained from working in specific areas of the platform with the team.

The Skills you'll need

To be successful in this role, you must have the following skills and experience:

  • Around 3-6 years of software development experience, including developing data pipelines, SQL and enterprise applications.
  • Must have experience with big data tools: PySpark, Presto (Athena), Airflow, etc.
  • Strong Python programming skills, with design and development experience building data pipelines.
  • Knowledge of Base SAS code is beneficial.
  • An understanding of data architecture, data integration and data management processes would be a plus.
  • Good knowledge of DevOps tools.
  • Ability to learn and adapt continuously in a fast-moving environment.
  • Excellent written and verbal communication with strong analytical skills.

A great opportunity to join a long-term project for a leading company in the financial services industry. My client is looking to interview ASAP, so if you are looking to secure a new contract before Christmas, please get in touch!

Kind Regards