Salary: Up to £90,000 per year
Jefferson Frank is proudly working with a very well established client based in London. The client's Chief Data Office is leading the transformation to create a data-driven organisation, using data in every decision and customer experience to make insurance easier and better value for customers. The Chief Data Office is working closely with the business to build a new large-scale, cloud-hosted, secure, and consolidated data and analytics platform that enables users to interact with all their data from a single trusted platform. The platform leverages the economics of big data, cloud elasticity, Machine Learning (ML)/Artificial Intelligence (AI) automation, and permissioned data sharing to turn information into business insights and address business and operational challenges.
- Work as part of the Chief Data Office to ensure our data models and architectural patterns are aligned to user and business requirements.
- Design and lead the implementation of our new core data platforms to support the adoption of modern data solutions within Direct Line Group.
- Champion and consult with other teams within Direct Line to advocate for the Chief Data Office function and the architectural solutions being generated.
- Work closely with the data engineering team to design and implement enterprise-grade pipelines for data.
- Experience with data modelling (e.g. star and snowflake schemas) and performance optimisation of data tables.
- Experience working with internal customers to understand their use cases and ensure the architecture meets their needs.
- Experience with both analytical and transactional processing databases.
- Experience building enterprise-grade data pipelines (ETL and ELT).
- Experience with cloud-native data services, such as AWS Athena, Redshift, Kinesis/Managed Kafka, DynamoDB, Glue, Lambda, and S3.
- Experience building common architectural patterns and championing their adoption.
- Solid understanding of enterprise patterns and of applying best practices when integrating various inputs and outputs at scale.
- Knowledge of software best practices, such as Test-Driven Development (TDD) and Continuous Integration (CI).
- Good knowledge of NoSQL and Big Data tools (Hadoop, Hive, MongoDB, DynamoDB, Presto).
- Understanding of DevOps principles, tools, and the intersection with cloud architecture.
- Experience in an Agile environment; familiarity with Jira, Confluence, and Git.
- Good understanding of the principles of data management, process and delivery.
- Experience working alongside a Data Governance programme of work.
- Any experience with Data Architecture around GDPR compliance is a nice-to-have.
- Understanding of insurance value / supply chain.
If this role genuinely interests you, please contact Sam on [Telephone number removed] or by email via the contact link for this recruiter.