Data Engineer - Big Data, AWS, ETL Tools.

You have experience with client projects and in handling vast amounts of data – working on database design and development, data integration and ingestion, and designing ETL architectures using a variety of ETL tools and techniques. You are someone with a drive to implement the best possible solutions for clients and to work closely with a highly skilled Data Science team.

Responsibilities:

•      Lead on projects from a data engineering perspective, working with our clients to model their data landscape, obtain data extracts, and define secure data exchange approaches

•      Plan and execute secure, good-practice data integration strategies and approaches

•      Acquire, ingest, and process data from multiple sources and systems into Big Data platforms

•      Create and manage data environments in the cloud

•      Collaborate with our data scientists to map data fields to hypotheses and curate, wrangle, and prepare data for use in their advanced analytical models

•      Apply a strong understanding of Information Security principles to ensure compliant handling and management of client data

This is a fantastic opportunity to be involved in end-to-end data management for cutting-edge Advanced Analytics and Data Science.

Qualifications:

•      Commercial experience leading on client-facing projects, including working in close-knit teams

•      2+ years of experience and interest in Big Data technologies (Hadoop / Spark / NoSQL DBs)

•      2+ years of experience working on projects in the cloud, ideally AWS or Azure

•      2+ years of experience working with streaming architectures and technologies like Kafka, Kinesis, Flink, or Confluent

•      Experience with open-source tools like Apache Airflow and Apache Griffin

•      Experience with DevOps and DataOps patterns and tools like Jenkins, Kubernetes, Docker, and Terraform

•      Data Warehousing experience with cloud products like Snowflake, Azure DW, or Redshift

•      Experience building operational ETL data pipelines across a number of sources, and constructing relational and dimensional data models

•      Experience with one or more ETL/ELT tools like Talend, Matillion, FiveTran, or Alooma

•      Experience building automated data quality checks and testing into data pipelines

•      Experience with AI, NLP, Machine Learning, etc. is a plus

•      Strong development background with experience in at least two scripting, object-oriented, or functional programming languages, e.g. SQL, Python, Java, Scala, C#, or R

•      Experience working on lively projects in a consulting setting, often working on multiple projects at the same time

•      Excellent interpersonal skills when interacting with clients in a clear, timely, and professional manner

•      A deep personal motivation to always produce outstanding work for your clients and colleagues

•      Excellence in team collaboration and working with others from diverse skill sets and backgrounds
