GCP Data Engineer with Python at Saransh Inc
Saransh Inc
Anywhere
Information Technology
Posted today
Job Description
Role: GCP Data Engineer with Python
Location: Dearborn, MI (4 days a week onsite)
Job Type: Contract
Experience: 8 to 12 years overall

Job Summary:
The Data Engineer will support the Credit Global Securitization (GS) team's upskilling initiative by contributing to data engineering efforts across cloud and traditional platforms. This role is intended to accelerate development and delivery. The engineer will work closely with cross-functional teams to build, optimize, and maintain data pipelines and workflows using GCP, Python, and ETL tools.

Required Technical Skills:
Minimum 3 years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer/Composer for orchestration.
Strong proficiency in Python for data engineering and automation.
Experience with RDBMS technologies such as DB2 and Teradata.
Exposure to Big Data ecosystems and distributed data processing.

Nice-to-Have Technical Skills:
Prior experience with ETL tools such as DataStage or Informatica.

Responsibilities:
The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows. The engineer will work with GCP tools such as Astronomer/Composer and leverage Python for automation and transformation tasks. The role involves integrating data from RDBMS platforms such as DB2 and Teradata and supporting ETL processes using tools like DataStage or Informatica. The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and will be expected to contribute to knowledge sharing and process improvement. Specifically:
Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
Collaborate with cross-functional teams to design and optimize data models in Teradata and DB2 environments.
Use Python scripting and automation to streamline geospatial data processing tasks.
Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations.
Leverage GCP to deploy scalable applications and services.

Key Skills: Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type: Full Time
Experience: 8 to 12 years
Vacancy: 1
Resume Suggestions
Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.
Quantify your achievements with specific metrics and results whenever possible to show impact.
Emphasize your proficiency in relevant technologies and tools mentioned in the job description.
Showcase your communication and collaboration skills through examples of successful projects and teamwork.