Data Specialist (Azure)


Premium Job From SidTech

Recruiter

SidTech

Listed on

21st December 2020

Location

United Kingdom

Salary/Rate

£300 - £350

Type

Contract

Start Date

21st December 2020


Job title - Data Specialist (Azure)
Location - Remote (anywhere in Europe except the UK)
Nationality - Any EU nationality
Client - Renowned IT consulting firm
Contract - 3-6 months (extendable)
Day rate - £250-£350 (depending on experience and location)

Job specs

- Ability to work in ambiguous situations with unstructured problems and anticipate potential issues/risks
- Demonstrated experience in building data pipelines in data analytics implementations such as data lakes and data warehouses
- At least 2 instances of end-to-end implementation of a data processing pipeline
- Experience configuring or developing custom code components for data ingestion, data processing and data provisioning, using big data and distributed computing platforms such as Hadoop/Spark, and cloud platforms such as AWS or Azure
- Hands-on experience developing enterprise solutions, including designing and building frameworks, enterprise patterns, and database design and development, in 2 or more of the following areas:
  - End-to-end implementation of a cloud data engineering solution:
    - AWS (EC2, S3, EMR, Spectrum, DynamoDB, RDS, Redshift, Glue, Kinesis), or
    - Azure (Azure SQL DW, Azure Data Factory, HDInsight, Cosmos DB, PostgreSQL, SQL on Azure)
  - End-to-end implementation of a big data solution on the Cloudera/Hortonworks/MapR ecosystem:
    - Real-time solutions using Spark Streaming with Kafka/Apache Pulsar/Kinesis (see the sketch after this list)
    - Distributed compute solutions (Spark/Storm/Hive/Impala)
    - Distributed storage and NoSQL storage (Cassandra, MongoDB, DataStax)
  - Batch solutions and distributed computing using ETL/ELT (SSIS/Informatica/Talend/Spark SQL/Spark DataFrames/AWS Glue/ADF)
  - DW-BI (MSBI, Oracle, Teradata), data modelling, performance tuning, memory optimisation/DB partitioning
  - Frameworks, reusable components, accelerators, CI/CD automation
  - Languages (Python, Scala)
- Proficiency in data modelling, for both structured and unstructured data, at various layers of storage
- Ability to collaborate closely with business analysts, architects and client stakeholders to create technical specifications
- Ensure the quality of delivered code components by employing unit testing and test automation techniques, including CI in DevOps environments
- Ability to profile data, assess data quality in the context of business rules, and incorporate validation and certification mechanisms to ensure data quality
- Ability to review technical deliverables and to mentor and drive technical teams to deliver quality work
- Understand system architecture and provide component-level design specifications, both high-level and low-level design
- Experience in building ground-up data lake solutions
- Provide support in building RFPs
- Data governance using Apache Atlas, Falcon, Ranger, Erwin, Metadata Manager
- Understanding of design patterns (Lambda architecture, data lake, microservices)
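As a rough illustration of the real-time pipeline work named in the list above, the minimal PySpark Structured Streaming sketch below reads JSON events from Kafka and lands them as Parquet in a data lake path. The broker address, topic, event schema and storage paths are hypothetical placeholders, not details from this role.

from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Hypothetical event schema -- placeholder fields for illustration only.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (placeholder broker and topic).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "events")
       .load())

# Kafka delivers bytes; cast the message value to a string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land the parsed events as Parquet in the lake (placeholder ADLS Gen2 path),
# with a checkpoint location so the stream can recover its progress on restart.
query = (events.writeStream
         .format("parquet")
         .option("path", "abfss://lake@account.dfs.core.windows.net/raw/events")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()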
