
Cloud ETL Software Engineer III at JPMorganChase

JPMorganChase Columbus, NE

Job Description

Description

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Corporate Technology Finance and Risk Warehouse SRE Team, you will solve complex and broad business problems with simple and straightforward solutions. Through code and cloud infrastructure, you will configure, maintain, monitor, and optimize applications and their associated infrastructure to independently decompose and iteratively improve on existing solutions.

Job responsibilities

- Guides and assists others in building appropriate level designs and gaining consensus from peers where appropriate
- Collaborates with other software engineers and teams to design and implement deployment approaches using automated continuous integration and continuous delivery (CI/CD) pipelines
- Collaborates with other software engineers and teams to design, develop, test, and implement availability, reliability, and scalability solutions in their applications
- Implements infrastructure, configuration, and network as code for the applications and platforms in your remit
- Collaborates with technical experts, key stakeholders, and team members to resolve complex problems
- Understands service level indicators and utilizes service level objectives to proactively resolve issues before they impact customers
- Supports the adoption of site reliability engineering best practices within the team

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 3 years of applied experience
- Strong analysis, research, investigation, and evaluation skills with a structured approach to problem solving
- Specialized ETL knowledge in Spark
- Experience with monitoring and observability tools including Dynatrace, OpenTelemetry (OTEL), Prometheus, Datadog, and Grafana, particularly in dashboard development
- Proficiency in at least one programming language such as Python, Java/Spring Boot, or Scala
- Working knowledge of Kubernetes, Docker, or any other container technology
- Experience managing and developing/deploying on cloud (private cloud or public cloud)
- Knowledge of Git, Bitbucket, Jenkins, SONAR, Splunk, Maven, AIM, and continuous delivery tools
- UNIX file management and administration, and good shell scripting experience
- Production working knowledge of Databricks and Apache Airflow on AWS
- Willingness to work weekend support

Preferred qualifications, capabilities, and skills

- Developing/deploying and running Ab Initio (ETL tool) on a public cloud such as AWS
- AWS and/or Databricks certification
- Experience developing and running data pipelines using PySpark
- Oracle (v9i/10/11/19c) running on Exadata; ANSI SQL and PL/SQL stored procedure support/development
- Working knowledge of the Control-M/Autosys scheduling packages
- Knowledge/experience in Hadoop environment administration: release deployments to Hive/HBase, supervising Hadoop jobs, and performing cluster coordination services

Key Skills: SQL, Pentaho, PL/SQL, Microsoft SQL Server, SSIS, Informatica, Shell Scripting, T-SQL, Teradata, Data Modeling, Data Warehouse, Oracle

Employment Type: Full-Time
Experience: years
Vacancy: 1

Resume Suggestions

Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.

Quantify your achievements with specific metrics and results wherever possible to show impact.

Emphasize your proficiency in the technologies and tools mentioned in the job description.

Showcase your communication and collaboration skills through examples of successful projects and teamwork.
