Big Data Architect


Recruiter: Experis IT

Listed on: 22nd November 2019

Location: Leeds

Salary/Rate: £450 - £510

Type: Contract

Start Date: ASAP


Role: Big Data Architect

Start Date: ASAP

Location: Leeds

Duration: 4 months

IR35 status: In Scope

Would you like to join a global leader in consulting, technology services and digital transformation? Our client is at the forefront of innovation, addressing the entire breadth of opportunities in the evolving world of cloud, digital and platforms.

As a Big Data Architect you will be able to demonstrate expertise in leading enterprise data architecture and design for large, complex businesses. Your credible track record of designing for microservices, external data ingestion, real-time data analytics, domain-driven master data models, event-driven enterprise data sharing, authoritative data sources and an external API exchange will be put to use in a continuously improving, challenging and satisfying environment.

You'll thrive creating value from large, diverse data volumes, e.g. by using Tableau, Qlik, C3, Hadoop, Spark, SQL, MongoDB Atlas and Python to create modelling features from underlying transactional data. You will be willing to roll up your sleeves periodically to co-create solutions hands-on using modern data science and general programming languages, and you will be comfortable working in a cross-functional team (including UX, analysts, statisticians, engineers, product owners, security risk, etc.). Your significant experience in design and implementation, along with your excellent technical and analytical skills, will be regularly put to the test.

Essential Knowledge:

Expert-level data modelling, from conceptual through to physical: relational, object, analytical and NoSQL.
At least 10 years' data modelling experience in a software engineering environment.
Expert in domain-driven design and microservices.
Expert in designing analytical database models from underlying transaction data.
Hands-on experience with MongoDB, Hadoop, Tableau, Qlik and Spark.
Scripting/coding experience (e.g. Bash, Python, Perl).
Excellent experience with the Hadoop ecosystem (such as HDFS, YARN and/or Hive).
Strong experience with streaming and stream processing frameworks (such as Spark, Storm, Flink, Kafka and/or Kinesis).
Good knowledge of at least one of the following programming languages: Python, Scala, Go, Kotlin, Java.
Experience with NoSQL databases (such as HBase, Cassandra and/or MongoDB).
Experience with public cloud-based technologies (such as Kubernetes, AWS, GCP, Azure and/or OpenStack).
Expert level in the creation and maintenance of enterprise data artefacts.
Highly proficient in at least one query language for each of the following: relational, analytical and NoSQL.
Proven innovator with proficient software engineering skills to prototype relational, analytical and NoSQL solutions.
Highly motivated, with experience of selecting and working with data and data tools.
Experience of canonical modelling.
Capable of engaging senior stakeholders.
Capable of developing and maintaining strong working relationships within the organisation and with third-party suppliers.
Able to manage time effectively and work proactively across projects as well as BAU tasks.
Written and verbal skills to communicate complex data and information problems and solutions.
Experience of developing data strategies, policies and standards.
Experience of working within the public sector or a comparable organisation within the last 3 years.
TOGAF Practitioner.

Please submit CVs today!
