Data Architect (inside IR35)



Recruiter: SidTech

Listed on: 20th November 2020

Location: Osterley

Salary/Rate: £500

Type: Contract

Start Date: 20th November 2020



Job Title: Data Architect
PAYE or Limited Company (mandatory): PAYE
Location: Osterley, West London
Department/Practice: Technology

Job Purpose and Primary Objectives: Data Architect

Key Responsibilities (please specify if the position is an individual one or part of a team):
- Collaborate with stakeholders to devise a data strategy that addresses Sky's data needs;
- Build an inventory of the data needed to implement the architecture;
- Create a fluid, end-to-end vision for how data will flow through the organisation;
- Develop data models for a data warehouse, data marts and operational data stores;
- Design analytical business views that present data in a simpler, more natural way to enable self-service analytics and reporting;
- Implement measures to improve data accuracy and accessibility;
- Develop data quality checks for source and target data sets;
- Document and publish metadata and table designs to facilitate data adoption;
- Perform SQL and ETL tuning as necessary;
- Design new tools with a focus on direct data ingestion from IoT devices, faster feedback loops, real-time data processing, machine learning, segmentation, personalisation and decision support;
- Identify and evaluate data ingestion and management technologies that can be leveraged to industrialise and automate data ingestion and processing;
- Research new opportunities for data acquisition;
- Ensure data protection measures are adhered to, including anonymisation, access controls, PCI DSS compliance and EU GDPR compliance;
- Constantly monitor, refine and report on the performance of data management systems.

Technical Leadership:
You will work with the Lead Architect, Data Platform, to articulate and evangelise the overall mission and vision of the department, and will support delivery teams in aligning with the department's technical strategy and technical governance processes. You will achieve this by:
- Providing expertise on open-source tools and systems, and thought leadership around agile development practices;
- Identifying the key technologies, components and practices that deliver value;
- Evangelising strategic technologies and best practices to help spread acceptance within the delivery teams;
- Supporting the definition of guidelines for architecture and solution design within the department, and helping projects deliver within those guidelines;
- Participating as a subject matter expert on internal working groups and forums as requested;
- Maintaining your own skills and knowledge in the field of data architecture and design.

Key Skills/Knowledge:
- Hands-on, demonstrable experience of defining data architecture and strategic direction to deliver Data Platform capabilities supporting BI/Analytics/Reporting, Personalisation or Advanced Analytics (AI/ML) use cases;
- Hands-on, demonstrable experience with data warehousing, data mart design and data lake technologies;
- Proven experience leveraging Big Data offerings on Google Cloud Platform (GCP) such as BigQuery, Dataflow, Dataproc, Dataprep, Pub/Sub, DLP etc., or their functional equivalents on Amazon Web Services or Microsoft Azure;
- Appreciation of architectural principles for building systems that support very high concurrency, are highly available, and remain resilient in the face of dependent component failures;
- Understanding of machine learning for garnering better insights from data;
- Demonstrable experience designing dashboards/reports using Tableau, QlikView, Power BI etc.;
- Demonstrable experience of defining and delivering solutions using Lambda and Kappa architectures, with emphasis on Apache Kafka-based streaming solutions;
- Good understanding of the Apache Hadoop ecosystem and tools such as HBase, HDFS, Spark, Hive, Pig, Lucene/Solr and Flume;
- Ability to anticipate business needs and translate them into technical solutions;
- Ability to conceptualise and design platforms capable of supporting multiple systems or applications;
- Ability to apply modelling techniques as needed to data, logical and physical architectures;
- Strong analysis and problem-solving skills;
- Good consultancy and influencing skills, working within strong technical environments;
- Good written and verbal communication skills and the ability to work in a global matrix organisation;
- Adept at translating complex technical concepts into meaningful recommendations;
- Excellent planning and organisational skills.

Experience Required:
As listed under Key Skills/Knowledge above.

Duration of the Assignment: 6 months (extendable based on performance)
