Responsibilities of the Candidate:
- Interact with GDIA product lines and business partners to understand data engineering opportunities, tools, and needs.
- Collaborate with Data Engineers, Technical Anchors, and Data Architects to design and build templates, pipelines, and data products, including automation, transformation, and curation, using best practices
- Develop custom cloud solutions and pipelines with GCP-native and related tools – BigQuery, Dataproc, Data Fusion, Dataflow, dbt, Airflow DAGs, and Terraform/Tekton
- Operationalize and automate data best practices: quality, auditability, timeliness, and completeness
- Participate in design reviews to accelerate the business and ensure scalability
- Leverage logging tools such as Dynatrace to support DevOps and debug production issues
- Follow software craftsmanship standards
- Pair program frequently to stay current on the code base and tech stack.
- Coordinate reuse, architecture, and interoperability strategy.
- Foster DevOps CI/CD infrastructure and an automated-testing mentality and capability. Champion continuous technical improvement for the platform and pursue opportunities to pay down tech debt.
- Help Product Owners understand our iterative development approach and focus on delivering a Minimum Viable Product through careful and deliberate prioritization.
- Grow technical capabilities / expertise and provide guidance to other members on the team
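As a hedged illustration of the "operationalize and automate data best practices" responsibility above, automated quality checks for a batch of records might look like the minimal Python sketch below. The field names, thresholds, and sample rows are assumptions for illustration only; in a real GCP pipeline these rows would come from a BigQuery table or a Dataflow/Data Fusion stage.

```python
from datetime import datetime, timedelta, timezone

# Illustrative record batch (hypothetical schema, not from the posting).
rows = [
    {"id": 1, "value": 42.0, "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "value": None, "loaded_at": datetime.now(timezone.utc)},
]

def check_completeness(rows, required_fields):
    """Return ids of rows with missing (None) values in required fields."""
    return [r["id"] for r in rows
            if any(r.get(f) is None for f in required_fields)]

def check_timeliness(rows, max_age=timedelta(hours=24)):
    """Return ids of rows loaded longer ago than max_age (stale data)."""
    now = datetime.now(timezone.utc)
    return [r["id"] for r in rows if now - r["loaded_at"] > max_age]

incomplete = check_completeness(rows, ["id", "value"])
stale = check_timeliness(rows)
print(incomplete)  # row 2 has a missing value
print(stale)       # no rows are stale in this example
```

In practice such checks would be wired into an Airflow DAG task and their results logged for auditability, but the core idea — codified, repeatable assertions on quality, timeliness, and completeness — is the same.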
Qualifications of the Candidate:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field of study
- Ability to work effectively across organizations, product teams and business partners.
- Knowledge of Agile (Scrum) methodology; experience writing user stories
- Strong understanding of database concepts and experience with multiple database technologies, including optimizing query and data processing performance.
- Full Stack Data Engineering Competency in a public cloud – Google Cloud or MS Azure or AWS
- Highly proficient in SQL and at least one of Python, Java, Scala, or Go (or similar); experience programming data transformations in Python or a similar language
- Demonstrated ability to lead data engineering projects, design sessions and deliverables to successful completion
- Deep understanding of data service ecosystems, including data warehouses, lakes, metadata, meshes, fabrics, and AI/ML use cases.
- Effective Communication both internally (with team members) and externally (with stakeholders)
- Knowledge of data warehouse concepts; experience with data warehouse/ETL processes
- Strong process discipline and thorough understanding of IT processes (ISP, data security).
- Strong software programming/engineering skills with a good understanding of DevOps, GitHub, etc.
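To give a concrete flavor of the "optimizing query and data processing performance" requirement above, here is a small, hedged sketch using SQLite (standing in for any SQL engine the role might use); the table and index names are illustrative, not from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()
print(plan)

# Adding an index lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchone()
print(plan)  # the plan now references the index
```

The same habit — inspecting query plans before and after schema or index changes — carries over to BigQuery and other warehouses, where the execution details differ but the discipline is identical.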
Interested and eligible candidates can apply for Ford Off Campus Recruitment 2024 online via the following link as soon as possible, before the last date.