AWS Data Architect


Recruiter: Experis IT

Listed on: 5th June 2018

Location: London

Salary/Rate: £700 - £800

Type: Contract

This job has now expired. Please search on the home page to find live IT jobs.

Job Description

Role: Cloud Data Architect

Team: Digital Enablement

Background

My client's quest to build a better bank starts a new chapter with the build of the Digital Bank. We are offering the right candidates a once-in-a-lifetime opportunity to join a team that will send a shock wave through the status quo. Working in a start-up environment, we are looking for people who can think, question the status quo and act to deliver a paradigm-breaking banking experience.

The successful applicant will be responsible for the technical architecture, design, implementation and support of the financial cloud platform, including security, storage, data ingestion, data integration, big data, NoSQL and graph database deployment, and overall platform optimisation.

This start-up business will need to be designed from the top down and built from the bottom up; as such, the Cloud Data Architect will play a critical role in the design, development and implementation of the financial cloud environment. The successful applicant will also work closely with the wider technology team to ensure alignment of requirements and the delivery of an excellent cloud environment for the benefit of the client.

Key Accountabilities

1. Architect and design the key components of a next-generation data platform, including a fully governed and managed data lake as the single source of truth

2. Design governed, managed, secure, scalable and compliant data ingestion and processing pipelines for collecting millions of metrics per day from client applications and users

3. Design logical and physical data models for various modern data storage and analytics engines

4. Manage a team of cloud data engineers to deliver code for new software products and/or features; manage individual project priorities, deadlines and deliverables

5. Augment the data engineering team through hands-on coding where needed

6. Design tools, frameworks and dashboards to support data governance initiatives

7. Bring in external data architecture and design best practices, keeping abreast of new tools and ways of working that may improve the efficiency of the data platform

8. Deliver excellent work at speed; document, fail fast and rebuild

Experience

Basic

Experience developing large-scale data architecture for a multi-TB traditional relational data warehouse environment

Experience developing large-scale data architecture for a Hadoop environment, with at least Hive, Pig, Oozie and Spark

Experience designing scalable data integration jobs, workflows and orchestration in a multi-TB traditional relational data warehouse and a large-scale big data environment

Experience in designing security, encryption, access control and data masking patterns for large-scale batch data ingestion and processing pipelines on AWS (at least 12 months of hands-on experience)

Experience designing stream ingestion and processing data pipelines leveraging open source technologies such as Kafka, Storm, Flink and Flume

Strong experience working with RDBMS databases (e.g. PostgreSQL or SQL Server), managing connection pools, performance tuning and optimisation

Large-scale systems software design and development experience, including Unix/Linux

Demonstrable experience in Agile/DevOps/Continuous Delivery practices within another organisation

Must be able to work independently or as part of a team, with strong interpersonal skills

BS degree in Computer Science or a related technical discipline, or equivalent work experience

Preferred Skills:

Background designing data pipelines leveraging AWS technologies (e.g. AWS Data Pipeline, Amazon Kinesis, EMR clusters, AWS Lambda, Redshift)

Strong working knowledge of architecture, performance and best practices in either Apache Hive or Apache Spark, ideally on EMR clusters

Experience in architecting and designing big data solutions with NoSQL and MPP products

Experience with data governance, specifically developing data quality processes

5+ years of technical delivery experience in banking and/or digital businesses

Good understanding of financial services, banking products and the underlying technology solutions and architectures, and/or relevant experience of digital/e-commerce platforms

Excellent analytical skills, technical skills, written and verbal communication skills, planning and organisational skills

Ability to explain complex technical issues in a way that non-technical stakeholders will understand.

Highly organised, with the ability to deliver under pressure in a fast-paced environment

Exposure to the HashiCorp product stack (Terraform, Consul, Packer, Vault), Jenkins, configuration management and other DevOps and CI/CD tools
