Sr. Data Engineer - Remote

Role : Sr. Data Engineer

Skills : Spark (PySpark), SQL, Fivetran, Airbyte, HVR, Apache Flink, ETL, ADLS.

Location : Maryland / Remote


Job Responsibilities/KRAs:

Design, develop, and maintain ETL processes using Spark (PySpark) to integrate data from multiple source systems into Iceberg tables on ADLS.

Migrate petabyte-scale databases to Iceberg tables on ADLS.

Process real-time data from Azure Service Bus using Spark Streaming, Airbyte, or Flink.

Optimize ETL workflows to ensure efficient data processing and loading.

Develop scripts to automate data processing and loading tasks.

Implement data quality checks and validation processes within ETL workflows.
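As a rough illustration of the data-quality responsibility above, the sketch below shows row-level validation that could run before loading. The rule names, field names, and rejection handling are hypothetical, not from this posting; a real pipeline would express these checks in PySpark or a dedicated quality framework.

```python
# Hypothetical sketch: row-level data quality checks inside an ETL step.
# Field names ("id", "event_ts", "amount") and rules are illustrative only.

def validate_row(row, required_fields=("id", "event_ts")):
    """Return a list of rule violations for one record."""
    errors = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            errors.append(f"missing:{field}")
    amount = row.get("amount")
    if amount is not None and amount < 0:
        errors.append("negative:amount")
    return errors

def split_valid_invalid(rows):
    """Partition records into (valid, rejected-with-reasons) lists,
    so bad rows can be quarantined and reported rather than loaded."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "event_ts": "2024-01-01", "amount": 10.0},
    {"id": None, "event_ts": "2024-01-02", "amount": -5.0},
]
good, bad = split_valid_invalid(rows)
```

Keeping the rejected rows together with their violation reasons makes the quarantine output directly usable for the validation reporting the role calls for.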

Understand business requirements and project scope, create ETL code per business logic and processes, and provide task estimates as required, with supporting data points.

Ensure data governance policies are adhered to, including data lineage and metadata management.

Provide support for data-related issues and troubleshoot ETL-related problems.

Create and maintain technical documentation and reports for stakeholders.

Requirements –

10 years of total technical experience designing, developing, and implementing ETL solutions using Apache Spark.

Experience across the complete Software Development Life Cycle, including systems analysis of applications in client/server environments.

Advanced working knowledge of various software applications (Fivetran, Airbyte, HVR, Apache Flink, SQL).

Experience building streaming data pipelines with Spark Streaming, plus shell scripting, deployment activities, performance tuning, and error handling.

Excellent hands-on experience with various data sources, transformations, partitioning/de-partitioning, databases, datasets, and JSON/XML/Parquet file formats.
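To make the partition/de-partition item above concrete, here is a minimal sketch of the idea in plain Python: spread records across buckets by a stable key hash, then gather them back into one dataset. The bucket count and key field are made-up parameters; in Spark this corresponds to `repartition`/`coalesce` rather than hand-rolled code.

```python
# Illustrative sketch of partition/de-partition, not a production pattern.
import hashlib

def partition(records, key, n_buckets=4):
    """Assign each record to a bucket via a stable hash of its key,
    so the same key always lands in the same bucket."""
    buckets = [[] for _ in range(n_buckets)]
    for rec in records:
        digest = hashlib.md5(str(rec[key]).encode()).hexdigest()
        buckets[int(digest, 16) % n_buckets].append(rec)
    return buckets

def departition(buckets, key):
    """Merge all buckets back into a single dataset, restoring key order."""
    merged = [rec for bucket in buckets for rec in bucket]
    return sorted(merged, key=lambda rec: rec[key])

records = [{"id": i, "val": i * i} for i in range(10)]
buckets = partition(records, "id")
restored = departition(buckets, "id")
```

A stable hash (rather than Python's salted built-in `hash`) keeps bucket assignment reproducible across runs, which matters when partitioned outputs are written to files.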

Experience replicating data from RDBMS sources using HVR.

Experience working with Airbyte or Apache Flink is preferred.

Analytical problem-solving and business interaction skills.

Effective day-to-day communication with the entire offshore team, the customer, and all concerned teams; provide daily status reports to all stakeholders.

Experience in the insurance (e.g., underwriting, claims, policy issuance) or financial industry preferred.


Apply here : https://talenthubsolutions.com/

Srividya Chekuri

Senior Data Engineer || AWS Services, GCP, Azure || Pyspark || Apache Spark || Kafka || Databricks || Snowflake || ETL Tools || Big Data technologies : Hadoop, HIVE || Actively looking for #C2C or #C2H

3 mo

Please share requirements to [email protected]. I have 10+ years of experience in a data engineer role.

CHESTER SWANSON SR.

Next Trend Realty LLC./wwwHar.com/Chester-Swanson/agent_cbswan

3 mo

Thanks for Sharing.

Meghana S

Data Engineer | Big Data Engineer | Hadoop developer| Spark Developer | ETL Developer | Cloud Engineer | Snowflake Developer

3 mo

Please share to [email protected]

Praneet G

Data Engineer at adMarketplace

3 mo

Please share the requirement to [email protected].

M Rao

Senior Bench Sales Recruiter at Dataquestcorp

3 mo

Please share the requirement at [email protected].

