AWS Cloud Architect - £650 - London - AWS, Terraform, Hadoop


Premium Job From Nigel Frank International

Recruiter

Nigel Frank International

Listed on

31st May 2018

Location

London

Salary/Rate

£600 - £650

Type

Permanent

Start Date

ASAP


AWS Cloud Architect - London - AWS, Terraform, Hadoop - £ Negotiable

The successful applicant will have responsibility for the technical architecture, design, implementation and support of the company's AWS cloud platform, including security, storage, data ingestion, data integration, big data, NoSQL and graph database deployment, and overall platform optimisation.

This start-up business will need to be designed from the top down and built from the bottom up; as such, the Cloud Data Architect will play a critical role in the design, development and implementation of the AWS cloud environment. The successful applicant will also work closely with the wider technology team to ensure alignment of requirements and the delivery of a world-class cloud environment for the benefit of everyone involved.

Key Technology Skills - AWS, Terraform, Jenkins, HashiCorp (Consul/Packer/Vault), Lambda, Redshift, Hadoop, Apache

Experience

* Experience developing large-scale data architecture for a multi-TB traditional relational data warehouse environment

* Experience developing large-scale data architecture for a Hadoop environment, including at least Hive, Pig, Oozie and Spark

* Experience designing scalable data integration jobs, workflows and orchestration in a multi-TB traditional relational data warehouse and a large-scale big data environment

* Experience designing security, encryption, access control and data masking patterns for large-scale batch data ingestion and processing pipelines on AWS (at least 12 months of hands-on experience)

* Experience designing streaming ingestion and processing pipelines leveraging open-source technologies such as Kafka, Storm, Flink and Flume

* Strong experience working with relational databases (RDBMS, e.g. PostgreSQL or SQL Server), managing connection pools, performance tuning and optimisation

* Large-scale systems software design and development experience, including Unix/Linux

* Demonstrable experience of Agile/DevOps/Continuous Delivery practices in a previous organisation

* Must be able to work independently or as part of a team, with strong interpersonal skills

* BS Degree in Computer Science or related technical discipline or equivalent work experience

Preferred Skills:

* Background designing data pipelines leveraging AWS technologies (e.g. AWS Data Pipeline, Amazon Kinesis, EMR clusters, AWS Lambda, Redshift)

* Strong working knowledge of architecture, performance and best practices in either Apache Hive or Apache Spark, ideally on EMR clusters

* Experience in architecting and designing big data solutions with NoSQL and MPP products

* Experience with data governance, specifically developing data quality processes

* 5+ years of technical delivery experience in banking and/or digital businesses

* Good understanding of financial services, banking products and the underlying technology solutions and architectures, and/or relevant experience of digital/e-commerce platforms

* Excellent analytical skills, technical skills, written and verbal communication skills, planning and organisational skills

* Ability to explain complex technical issues in a way that non-technical stakeholders will understand.

* Highly organised, with the ability to deliver under pressure in a fast-paced environment

* Exposure to the HashiCorp product stack (Terraform, Consul, Packer, Vault), Jenkins, configuration management and other DevOps and CI/CD tools

Please e-mail [email protected] or call 0203 868 5174.
