Software Engineer III Python, AWS at JPMorganChase
JPMorganChase
Chicago, IL
Information Technology
Posted today
Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III at JPMorgan Chase within the Commercial and Investment Bank, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
- Frequently use SQL and understand NoSQL databases and their niche in the marketplace.
- Implement best practices for data engineering, ensuring data quality, reliability, and performance.
- Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows.
- Perform data extraction and implement complex data transformation logic to meet business requirements.
- Monitor and execute data quality checks to proactively identify and address anomalies.
- Ensure data availability and accuracy for analytical purposes.
- Identify opportunities for process automation within data engineering workflows.
- Deploy and manage containerized applications using Amazon ECS or Kubernetes (EKS).
- Implement data orchestration and workflow automation using AWS Step Functions and EventBridge.
- Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.

Required qualifications, capabilities, and skills
- Formal training or certification in software/data engineering concepts and 3 years of applied experience.
- Experience across the data lifecycle.
- Proficiency in SQL coding (e.g., joins and aggregations).
- Experience with microservice-based components using ECS or EKS.
- Experience building and optimizing data pipelines, architectures, and data sets (Glue or Databricks ETL).
- Proficiency in object-oriented and functional scripting languages (Python, etc.).
- Experience developing ETL processes and workflows for streaming data from heterogeneous sources (Kafka).
- Experience building pipelines on AWS with Terraform and using CI/CD pipelines.

Preferred qualifications, capabilities, and skills
- Advanced knowledge of RDBMS (e.g., Aurora) and OpenSearch.
- Experience working with modern lakehouse platforms: Databricks, Glue.
- Experience with data pipeline and workflow management tools (Airflow, etc.).
- Strong analytical and problem-solving skills with attention to detail.
- Ability to work independently and collaboratively in a team environment.
- A proactive approach to learning and adapting to new technologies and methodologies.

Key Skills: APIs, Docker, Jenkins, REST, Python, AWS, NoSQL, MySQL, JavaScript, PostgreSQL, Django, Git

Employment Type: Full-Time
Experience: years
Vacancy: 1
Salary: $114,000 - $155,000
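The responsibilities above center on ETL pipelines with proactive data quality checks. As a minimal sketch of that pattern in Python (the stack named in the title), here is a quality gate between extract and load; the file paths, pandas usage, and specific checks are illustrative assumptions, not anything specified by the posting.

import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable descriptions of anomalies found in the frame."""
    issues = []
    if df.empty:
        issues.append("no rows extracted")
    for col, n in df.isna().sum().items():
        if n:
            issues.append(f"{col}: {n} null values")
    dupes = int(df.duplicated().sum())
    if dupes:
        issues.append(f"{dupes} duplicate rows")
    return issues

def etl(source_csv: str, target_parquet: str) -> None:
    df = pd.read_csv(source_csv)               # extract
    issues = run_quality_checks(df)            # proactive quality gate
    if issues:
        raise ValueError("data quality check failed: " + "; ".join(issues))
    df.columns = [c.strip().lower() for c in df.columns]  # transform (illustrative)
    df.to_parquet(target_parquet, index=False)            # load (requires pyarrow)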
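The posting also calls for orchestration with AWS Step Functions and EventBridge. A minimal boto3 sketch of those two interactions might look like the following; the state machine ARN, event source, and detail fields are hypothetical placeholders.

import json
import boto3

# Hypothetical ARN for illustration only.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:nightly-etl"

sfn = boto3.client("stepfunctions")
events = boto3.client("events")

def start_pipeline(batch_date: str) -> str:
    """Kick off the ETL state machine for one batch date."""
    resp = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"batch_date": batch_date}),
    )
    return resp["executionArn"]

def publish_completion_event(batch_date: str, row_count: int) -> None:
    """Emit a custom event that downstream consumers can subscribe to."""
    events.put_events(
        Entries=[{
            "Source": "data.pipeline",          # assumed custom source name
            "DetailType": "BatchLoaded",
            "Detail": json.dumps({"batch_date": batch_date, "rows": row_count}),
            "EventBusName": "default",
        }]
    )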
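For the streaming-ETL requirement (Kafka), a bare-bones consumer using the kafka-python package could look like this; the topic name, broker address, and record shape are assumptions for illustration.

import json
from kafka import KafkaConsumer  # kafka-python package

# Hypothetical topic and broker values.
consumer = KafkaConsumer(
    "orders.raw",
    bootstrap_servers=["localhost:9092"],
    group_id="etl-workers",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # Transformation step: normalize the payload before loading downstream.
    row = {"order_id": record.get("id"), "amount": float(record.get("amount", 0))}
    print(row)  # in practice, write to a staging sink or warehouse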
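Among the preferred tools, Airflow expresses pipeline steps as a DAG of dependent tasks. A skeletal DAG wiring extract, transform, and load together might look like this, assuming Airflow 2.x; the dag_id, schedule, and empty task bodies are placeholders.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...
def transform(): ...
def load(): ...

with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # declare task order: extract, then transform, then load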
Resume Suggestions
Highlight relevant experience and skills that match the job requirements to demonstrate your qualifications.
Quantify your achievements with specific metrics and results whenever possible to show impact.
Emphasize your proficiency in relevant technologies and tools mentioned in the job description.
Showcase your communication and collaboration skills through examples of successful projects and teamwork.