About the job
This role can be remote, so we are open to applications from home-based and remote applicants.
Are you looking for us?
Our client is Google's largest Premier Partner in Europe, delivering leading-edge cloud services to some of the world's most exciting brands, in sectors from retail through automotive to finance. This role will expose you to leading cloud transformation and optimisation projects delivered through the Google Cloud Platform (GCP), using large-scale data tools such as BigQuery and Bigtable.
You will be joining a dedicated Data and Analytics team who concentrate on using data to transform the way the world works. As a Data Engineer, you will work on leading-edge cloud data transformation and optimisation projects for global brands, helping to architect solutions that solve clients' problems.
Are we looking for you?
As a Data Engineer you will play a key role in the Application Development Team, designing cloud enterprise solution architectures and helping to build big data solutions for customers by integrating structured, semi-structured and unstructured data on low-latency platforms. You will solve complex business problems using Big Data, AutoML, ML and visualisation technologies.
Training and certification on Google Cloud Platform (GCP) are provided as part of the extensive onboarding programme. The role offers you an exciting long-term career with us as a fast-growing market leader.
Skills that come naturally to you…
- Consult, design and coordinate architecture to modernise infrastructure for performance, scalability, latency and reliability
- Knowledge of, and the ability to consult on, various technologies such as databases, data warehousing and big data, e.g. BigQuery, Oracle, Redshift, Teradata, Hadoop
- Articulate communication skills: able to explain complex solutions to customers clearly and concisely, and to adjust style for varied audiences within an organisation
- Strong analytical and design skills around the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
- History of working with Data warehouse solutions (on-premise & Cloud)
- Delivered cloud transformation & migration solutions architecture
- Good experience of SQL on relational & non-relational databases (Redshift, BigQuery, MySQL, etc.)
- Practical experience of data curation, cleansing, creation and maintenance of data sets for business purposes
- Experience in data science modelling, analytics & BI reporting (Tableau, Looker, Periscope, Data Studio)
- Knowledge and experience of common open source languages for data integration (Python, Go, Node.js, Java)
- Understanding of industry standard containerisation approaches (Docker, Kubernetes, etc) would be advantageous
- Experience in building scalable data pipelines using Hadoop and Spark clusters would be desirable
- Strong technical skills, aligned with good analytical skills
- Project delivery within Agile methodology and planning processes
- Experience with System and User Acceptance Testing
The Role (80%)
The role is varied so you'll get involved in a wide range of activities but here are the main customer facing things we'll need you to do:
- Design data storage, movement and orchestration solutions
- Identify, scope and participate in the design and delivery of cloud data platform solutions
- Design and execute a platform modernisation approach for customer data environments
- Design, coordinate and execute pilots, prototypes or proof of concepts, provide validation on specific scenarios and provide deployment guidance
- Document and share technical best practices/insights with engineering colleagues and the architect community
- Design data architecture solutions for structured data technologies, with an understanding of unstructured data technologies
- Create and maintain appropriate standards and best practices around Google Cloud SQL, BigQuery and other data technologies
- Define and communicate complex database concepts to technical and non-technical people
- Travel to client sites as appropriate.
The Role (20%)
- Create visualisations, dashboards and MIS reports
- Stay informed about upcoming trends and technologies in Big Data and share them with the wider team