Data Engineer


Recruiter: Nigel Frank International
Listed on: 9th May 2022
Location: New York
Salary/Rate: Up to £127,127
Type: Permanent

Data Engineer
Job Type: Full-Time, Permanent
Location: Remote
Salary: Up to $155K

Responsibilities

Under minimal supervision, the individual will act as a data engineer responsible for the development, customization, and support of data solutions that serve analytics needs throughout the Hexagon PPM organization. The individual will be involved in all phases of the software development lifecycle and will provide technical leadership in developing solutions that deliver high-quality, structured data for data exploration and analysis. Responsibilities will include designing, developing, optimizing, and testing large-scale data systems involving telemetry collection, data ingestion, transformation, storage, and analytics processes to support data visualization teams. The work will use big data technologies on the Microsoft Azure platform such as Azure Data Lake, Databricks, and PySpark. The individual should have a strong grasp of software engineering principles, data modeling, data architecture, and ETL processes in order to provide quality data and analytics to a variety of consumers.

The individual will automate and manage both streaming and batch data pipelines from various other business systems on-premises and in the cloud, such as Dynamics GP, Salesforce.com, and a custom licensing solution. This big data solution will combine those disparate data sources into complex datasets that meet the needs of both internal business users and external customers. The individual must therefore have the initiative to learn a broad set of enterprise systems technologies and patterns and how they apply to various business needs.

The solution will be used internally to meet business strategies and goals by facilitating a better understanding of our customers, including financial analysis, license and feature usage of PPM products, and support metrics. The data analysis should drive business performance improvements in areas such as product development and sales. The solution will also be used by customers to understand their usage of PPM products.

Because of the wide range of analytics consumers involved, this role will require good communication skills and the ability to learn the needs of each business area. The end goal will be to deliver clear and compelling information to customers and internal stakeholders.

The individual will work in an Agile environment using Scrum processes and XP development techniques, giving regular demos of functionality to a product owner and end users. The individual will support systems used by hundreds of employees and thousands of customers and may interface directly with users to provide quick resolution of critical support issues. The individual will work within a cross-functional team to deliver high-value solutions to complex problems and will act as a technical mentor, providing technical leadership in the design and maintenance of those solutions.

The individual must be able to work under minimal supervision, possess strong communication and critical thinking skills, and be able to understand business needs and find valuable solutions to meet them. The individual must be a team player and will work as part of a cross-functional team of developers.

Qualifications

Experience in data engineering, business systems development, software development, or another relevant technical field is required. Experience with PySpark, Databricks, Hadoop, and/or similar big data technologies is required. Experience with data modeling is required. Experience with SQL Server, including developing and optimizing SQL queries, is required. Experience with Databricks Delta Lake or other delta processing concepts is highly preferred.

Experience with Tableau or Einstein Analytics is preferred. Experience with Salesforce.com and other cloud technologies is preferred. Experience with a source control management solution is required. Experience with build automation and automated testing is preferred. Experience with Azure DevOps is preferred. Experience with ATDD, BDD, and/or TDD is preferred.

Adherence to coding standards, software configuration management, and DevOps management processes, including change management, source control, and release management, is required. Good documentation skills are expected, as are originality, ingenuity, independent judgment, and a reasonable degree of self-direction and decision-making.
