GCP Data Engineer - Contract


Recruiter

Nigel Frank International

Listed on

8th April 2021

Location

London

Salary/Rate

£450 - £550

Type

Contract

Start Date

ASAP

Job Description

I am working with a hugely exciting and fast-growing Google Cloud partner recruiting a data engineer to support their client projects across the UK. The role spans a range of projects for some hugely exciting clients and falls inside IR35.

Role & Responsibilities

- Design data storage, movement and orchestration solutions
- Identify, scope and participate in the design and delivery of cloud data platform solutions
- Design and execute a platform modernisation approach for customers' data environments
- Design, coordinate and execute pilots, prototypes or proofs of concept; provide validation on specific scenarios and deployment guidance
- Document and share technical best practices and insights with engineering colleagues and the architect community
- Design data architecture solutions for structured data technologies and understand unstructured data technologies
- Create and maintain appropriate standards and best practices around Google Cloud SQL, BigQuery and other data technologies
- Define and communicate complex database concepts to technical and non-technical people
- Travel to client sites as appropriate
- Create visualisations, dashboards and MIS reports
- Keep informed of upcoming trends and technologies in Big Data and share them with the wider team

Skills & Qualifications

- Consult on, design and coordinate architecture to modernise infrastructure for performance, scalability, latency and reliability
- Knowledge of, and able to consult on, technologies such as databases, data warehousing and big data, e.g. BigQuery, Oracle, Redshift, Teradata, Hadoop
- Articulate communication skills to explain complex solutions to customers clearly and concisely, with the ability to adjust style for a varied audience within an organisation
- Strong analytical and design skills across the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
- History of working with data warehouse solutions (on-premise and cloud)
- Experience delivering cloud transformation and migration solutions architecture
- Good experience of SQL on relational and non-relational databases (Redshift, BigQuery, MySQL, etc.)
- Practical experience of data curation, cleansing, and the creation and maintenance of data sets for business purposes
- Experience in data science modelling, analytics and BI reporting (Tableau, Looker, Periscope, Data Studio)
- Knowledge and experience of common open-source languages for data integration (Python, Go, Node.js, Java)
- Understanding of industry-standard containerisation approaches (Docker, Kubernetes, etc.) would be advantageous
- Experience building scalable data pipelines using Hadoop/Spark clusters would be desirable
- Strong technical skills, paired with good analytical skills
- Project delivery within Agile methodology and planning processes
- Experience with System and User Acceptance Testing
