Senior DevOps Engineer
This job has now expired. Please search on the home page to find live IT jobs.
You will be responsible for the configuration and ongoing management of our client's cloud PaaS/IaaS, working with Big Data platforms such as Hortonworks, based on Hadoop clusters running in cloud data centres such as those of Azure/AWS. You will ensure the platform is available to services teams and their service users. You will also carry out continuous improvements to the platforms, including system development, process design, automation and diagnostics improvements. Assuring the platform's security, privacy and resilience will be your greatest concern.
Key Responsibilities
Collaborating with teams who are provisioning infrastructure in a DevOps environment
Ensuring that principles of privacy, security and resilience are assessed and designed into solutions
Implementing and carrying out the ongoing administration of the Hadoop infrastructure
Building and supporting the technical stack
Evaluating technology for use in the stack
Integrating technology to provide an end-to-end architecture
Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
Working with delivery teams to set up new Hadoop users, including setting up Linux users, Kerberos principals and testing HDFS and Hive access for new users
Monitoring and maintaining Hadoop ecosystem connectivity and security to guarantee confidentiality, integrity and availability
Working with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
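To illustrate the user-onboarding responsibility above (Linux accounts, Kerberos principals, HDFS and Hive access checks), here is a minimal Python sketch that builds the commands an admin or automation job might run. The realm, keytab path, and Hive JDBC URL are illustrative assumptions, not a prescribed procedure for this role.

```python
"""Sketch of onboarding a new Hadoop user.

Builds (but does not execute) the shell commands for the steps named in
the responsibilities list. Realm, paths, and the beeline URL are
assumptions for illustration only.
"""


def onboarding_commands(user: str, realm: str = "EXAMPLE.COM") -> list:
    """Return ordered commands: Linux account, Kerberos principal,
    HDFS home directory, then HDFS and Hive smoke tests."""
    home = "/user/" + user
    return [
        f"useradd -m {user}",                                  # Linux user
        f"kadmin -q 'addprinc -randkey {user}@{realm}'",       # Kerberos principal
        f"kadmin -q 'ktadd -k /etc/security/keytabs/{user}.keytab {user}@{realm}'",
        f"hdfs dfs -mkdir -p {home}",                          # HDFS home dir
        f"hdfs dfs -chown {user}:{user} {home}",
        f"hdfs dfs -ls {home}",                                # HDFS access test
        # Hive access test over a Kerberized JDBC URL (host/principal assumed)
        f"beeline -u 'jdbc:hive2://hive.example.com:10000/default;"
        f"principal=hive/_HOST@{realm}' -e 'SHOW DATABASES;'",
    ]


if __name__ == "__main__":
    for cmd in onboarding_commands("analyst1"):
        print(cmd)
```

In practice these steps would be wrapped in a configuration-management tool rather than run by hand, so that onboarding is repeatable and auditable.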
Specific Skills
Agile software engineering practices such as CI/CT/CD
Experience with the architecture/engineering of cloud-based distributed systems (such as Azure)
Strong knowledge of Azure and its offerings and the Hadoop ecosystem using Hortonworks (preferred) or Cloudera
Cloudbreak, Hive, Ambari, Sqoop, Oozie, Spark, Atlas, Ranger, HBase, HDFS, YARN and ELK
Hands-on experience connecting cloud infrastructure to on-prem networks
VPNs, tunnelling, AD connectors, routing
Knowledge of network protocols such as TCP, UDP, HTTP/HTTPS, SSL/TLS, and APIs
Authentication technologies, e.g. LDAP, OAuth, 2FA, SAML, and SSO via Kerberos
Bash, Python, Java or Ruby
Experienced with security controls
VPNs, encryption and key/certificate management, endpoint protection, virtual firewalls/ACLs/NSGs, setting up bastion nodes, etc.
Background in distributed systems
databases, security, networking & load balancing, monitoring, scripting, automation
A deep understanding of distributed system design and dependency management
Operational expertise
troubleshooting skills, understanding of system capacity/bottlenecks, basics of memory, CPU, OS, storage, and networks
A strong grasp of protective monitoring: tools, approach and implementation
Linux system administration (configuration, installs, automation, and monitoring)
Experience automating infrastructure and software delivery to production
Familiarity with open source configuration management and deployment tools such as Ansible, Puppet and Terraform
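As a small illustration of the protective-monitoring and troubleshooting skills above, the following Python sketch checks TCP reachability of core cluster services. The service-to-port map is an assumption based on common Hortonworks defaults; a real deployment would load hosts and ports from configuration.

```python
"""Sketch of a connectivity check for Hadoop ecosystem services.

The service-to-port map uses commonly seen defaults and is illustrative
only; hostnames and ports vary per deployment.
"""

import socket

# Assumed default ports for a few services named in the stack above.
DEFAULT_PORTS = {
    "hdfs-namenode": 8020,
    "yarn-resourcemanager": 8032,
    "hiveserver2": 10000,
    "ambari": 8080,
}


def check_service(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def check_cluster(host: str, ports=None) -> dict:
    """Probe every service and return a name -> reachable map."""
    ports = ports or DEFAULT_PORTS
    return {name: check_service(host, port) for name, port in ports.items()}


if __name__ == "__main__":
    for name, up in check_cluster("127.0.0.1").items():
        print(f"{name}: {'UP' if up else 'DOWN'}")
```

A check like this would normally feed an alerting pipeline (e.g. the ELK stack mentioned above) rather than print to stdout.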