We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking Administration, you will be a key member of our agile team. Your role will involve enhancing, building, and delivering trusted technology products in a secure, stable, and scalable way. You will play a crucial role in maximizing the value of cloud-agnostic solutions, ensuring architectural designs are resilient and stable. You will also be involved in data-driven analysis, monitoring, and visibility solutions. Your role will include building and deploying Infrastructure as Code in CI/CD pipelines, automating processes, and conducting blameless Root Cause Analysis. You will collaborate with diverse teams and explore emerging technologies such as AI. This role provides an opportunity to influence modern designs; develop, test, and deploy applications and cloud-native microservices; and transform the business processes and user experience for JPMorgan Chase's Customer Acquisition and Marketing Platform.
Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure, high-quality production code, and reviews and debugs code written by others
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
- Adds to team culture of diversity, equity, inclusion, and respect
- Reviews architecture and design artifacts for complex applications and is accountable for ensuring highly resilient, stable designs where appropriate
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems, producing standardized metrics that show application health and break-point volumes
- Reviews all application issues and infrastructure failures as part of the RCA process to understand their true scope, and develops and implements remediation plans
- Ensures all deployments have scalable, well-architected designs with no choke-point capacity issues
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience
- SRE experience in a cloud environment
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Advanced proficiency in one or more programming languages such as Python, Java, or shell scripting
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- In-depth knowledge of the financial services industry and its IT systems
- Strong communication and presentation skills, with an ability to articulate technical matters to a broad audience
- Practical cloud deployment experience with AWS Cloud services and native data technologies such as AWS EMR, Glue, Lambda, MSK, RDS, DocumentDB, EC2, EKS, ECS, Oracle, Aurora, DynamoDB, etc.
Preferred qualifications, capabilities, and skills
- Understanding of database design concepts (pluggable databases) and data modeling for relational or non-relational databases such as RDBMS (Oracle/PostgreSQL), NoSQL (MongoDB, Cassandra), or NewSQL (CockroachDB), and search databases such as Elasticsearch, incorporating multi-master designs across multiple regions
- Understanding of data design and modeling principles, as well as architecture patterns such as data lake, lakehouse, data mart, data fabric, and data mesh; experience with data migrations and open/linked data media types such as RDF, Turtle, and JSON-LD will be useful
- Experience deploying to public and/or private clouds, ideally with multi-cloud or cloud-agnostic experience (Snowflake or another public cloud) and active/active cross-region designs; AWS Solution Developer, DevOps, or FinOps certifications are a plus
- Experience in one or more big data processing frameworks such as Spark, Flink, or Storm, with stream processing experience using Kafka
- Understanding of how to apply AI/ML to appropriate data to drive operational excellence