Data Engineer


Premium Job From Nigel Frank International

Recruiter

Nigel Frank International

Listed on

9th June 2021

Location

Swindon

Salary/Rate

£50,000 - £70,000

Type

Permanent

Start Date

ASAP

This job has now expired. Please search on the home page to find live IT Jobs.

Data Engineer

About the job

This role can be remote, so we are open to applications from home/remote-based applicants.

Are you looking for us?

Our client is Google's largest Premier Partner in Europe, delivering leading-edge cloud services to some of the world's most exciting brands, in sectors from retail through automotive to finance. This role will expose you to some of the most exciting cloud transformation and optimisation project opportunities on GCP. The client delivers its services through the Google Cloud Platform (GCP), using some of the world's most exciting new large-scale data tools, such as BigQuery and BigTable.

You will be joining a dedicated Data and Analytics team who concentrate on using data to transform the way the world works. As a Data Engineer, you will work on leading-edge cloud data transformation and optimisation projects for global brands, helping to architect solutions that solve clients' problems.

Are we looking for you?

As a Data Engineer you will play a key role in the Application Development Team, designing cloud enterprise solutions architecture and helping to build big data solutions for customers by integrating structured, semi-structured and unstructured data on low-latency platforms. You will work to solve complex business problems using Big Data, AutoML, ML and visualisation technologies.

Training and certification on Google Cloud Platform (GCP) are provided as part of the extensive onboarding program. The role offers you an exciting long-term career with us as a fast-growing market leader.
Skills that come naturally to you…

- Consult, design and coordinate architecture to modernise infrastructure for performance, scalability, latency and reliability
- Knowledge of, and the ability to consult on, various technologies such as databases, data warehousing and big data, e.g. BigQuery, Oracle, Redshift, Teradata, Hadoop
- Articulate communication skills to explain complex solutions to customers in a clear and concise manner, with the ability to adjust style to a varied audience within an organisation
- Strong analytical and design skills around the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
- History of working with data warehouse solutions (on-premise and cloud)
- Delivered cloud transformation and migration solutions architecture
- Good experience of SQL on relational and non-relational databases (Redshift, BigQuery, MySQL, etc.)
- Practical experience of data curation, cleansing, and the creation and maintenance of data sets for business purposes
- Experience in data science modelling, analytics and BI reporting (Tableau, Looker, Periscope, Data Studio)
- Knowledge and experience of common open-source languages for data integration (Python, Go, Node.js, Java)
- Strong technical skills, aligned with good analytical skills
- Project delivery within Agile methodology and planning processes
- Experience with System and User Acceptance Testing

It would be great if you have (or are excited to gain)

- Industry-standard containerisation approaches (Docker, Kubernetes, etc.)
- Experience building scalable data pipelines using Hadoop Spark clusters

The Role (80%)

The role is varied, so you'll get involved in a wide range of activities, but here are the main customer-facing things we'll need you to do:

- Design data storage, movement and orchestration solutions
- Identify, scope and participate in the design and delivery of cloud data platform solutions
- Design and execute a platform modernisation approach for customer data environments
- Design, coordinate and execute pilots, prototypes or proofs of concept; provide validation on specific scenarios and provide deployment guidance
- Document and share technical best practices and insights with engineering colleagues and the architect community
- Design data architectural solutions for structured data technologies and understand unstructured data technologies
- Create and maintain appropriate standards and best practices around Google Cloud SQL, BigQuery and other data technologies
- Define and communicate complex database concepts to technical and non-technical people
- Travel to client sites as appropriate

The Role (20%)

- Create visualisations, dashboards and MIS reports
- Keep informed about upcoming trends and technologies in Big Data and share them with the wider team
