Chief Big Data Engineer - Hadoop
Lloyds Banking Group
Manchester, London, Bristol
£60,000 - £92,800
Lloyds Banking Group is the UK’s leading digital bank and biggest mobile bank with over 13.5 million active online customers across our three main brands.
We’ve placed further digitisation at the heart of our strategy to become the bank of the future backed with an investment of £3bn over three years to realise this ambition in our people, platforms and data capabilities.
The simplification and modernisation of our enterprise data landscape underpins a huge amount of this transformation.
We’re evolving from using fragmented data sitting on legacy systems to building a data lake that will enable holistic customer views, better fraud detection, better marketing and better customer outcomes.
At the hub of this evolution is our Information Management Directorate - responsible for mastering and supplying all the customer data needed to meet the ever-more time-critical nature of banking.
They’re using the power of the Hortonworks Hadoop ecosystem to make this data lake, and all the benefits it entails, a reality.
Together we’ll make it possible…
As one of our Chief Engineers on this key platform you’d sit within a niche team and hold one of our most senior “hands-on” engineering leadership roles.
Here’s where you’ll make a difference:
• You’d be building and running highly productive Hadoop engineering teams instilled with the ethos of delivering quality big data solutions.
• You’ll get to establish effective working relationships with other teams across IT, the Group and offshore partners with a focus on delivering business value through sound software engineering methods and principles.
• You’ll drive the overarching Engineering strategy for the platform (tools, coding standards, quality controls etc) and set the direction for all software engineers within your area - ranging from a handful of feature teams to a £50m change portfolio.
• You’d build up a cutting-edge 'guild' of Software Engineers/Big Data SMEs - mentoring, nurturing and developing them whilst crafting opportunities for learning and growth.
You’ll need to have a background that covers:
• Big Data engineering with knowledge of agile development practices, TDD/BDD, automated builds, continuous integration, code quality metrics and Software Engineering oversight.
• Previous Engineering leadership experience, leading a diverse portfolio of activity with the ability to act independently and take the initiative.
• Previous experience building out the capabilities of Hadoop and broadening the functionality of the various toolsets with one eye on new and emerging tools to capitalise on.
• Leading teams of Engineers and SMEs and getting the very best out of their potential.
• The capability to architect highly scalable distributed systems using open source tools and big data technologies to build products, applications and solutions that perform.
And technically you ought to have expertise across a good number of the following:
• Running Hadoop clusters (MapR/Cloudera, ideally Hortonworks) and ecosystems at an enterprise level
• Hadoop applications across Spark, HBase, Kafka, Storm, Hive
• Large scale production databases (e.g. Oracle, DB2, SQL Server, Teradata etc)
• Java (J2EE), Scala and/or Python
• Ideally experience with SQL, NoSQL (Cassandra), relational database design and methods for effectively retrieving the data.
• Preferably experience designing end to end ETL pipelines using modern ETL and CDC tools such as Kafka
And if you have experience of big data modelling and management systems for data curation and consumption, then this would be a bonus.
We have a preference for Financial Services backgrounds but other relevant industry experience could be similarly useful, especially if regulated.
But it will be your people skills and real passion for developing engineers within strong team cultures that will set you apart from the rest of the field.
What can we offer you in return?
You'll enjoy an energising environment where you’ll see our real passion for diversity and equal opportunity. We want our Engineering colleagues to have genuine career opportunities and to represent the communities we serve.
And a package that also includes:
• A 20% performance related bonus
• A generous pension contribution
• 30 days holiday plus bank holidays
• Private Health cover
• 4% Flexible benefit cash pot you can shape to fit your lifestyle
• Car/allowance of £4700
• Share schemes including free shares
• Discounted products and discounted high street shopping
So if joining us at a critical and exciting time in our data future appeals, and you have the Hadoop Engineering leadership skills we’re seeking, then we’d love to hear from you…
To find out more and to apply, please click the APPLY button.
Closing date: 28th September 2018