AWS Software Engineer III - Databricks/Python/PySpark
JPMorgan Chase & Co. | Wilmington, United States



Job description

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As an AWS Software Engineer III-Databricks/Python/PySpark at JPMorgan Chase within the Corporate Sector-Global Finance Team, you will be a key member of an agile team, tasked with designing and delivering cutting-edge products that are secure, stable, and scalable.

Your role involves implementing essential technology solutions across diverse technical domains to support the firm's business goals effectively.

Job responsibilities 

  • Developing and optimizing data pipelines and workflows to support data integration, transformation, and analysis
  • Implementing best practices for data management, ensuring data quality, security, and compliance
  • Writing secure, high-quality production code following AWS best practices, and deploying efficiently using CI/CD pipelines
  • Creating architecture and design documents for complex applications, ensuring software code meets design constraints
  • Identifying hidden issues and patterns in data to enhance coding practices and system architecture
  • Contributing to software engineering communities, promoting diversity, opportunity, inclusion, and respect within the team
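The data-quality responsibility above (ensuring quality before data moves downstream) can be sketched as a minimal validation gate. This is an illustrative sketch only, in plain Python rather than PySpark; the function and field names (`check_record`, `run_quality_gate`, `account_id`, `amount`) are hypothetical and not taken from the posting.

```python
# Illustrative sketch: a minimal data-quality gate of the kind a pipeline
# step might apply before loading records downstream. All names here are
# hypothetical examples, not part of the job posting.

def check_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    if not record.get("account_id"):
        errors.append("missing account_id")
    amount = record.get("amount")
    if amount is None or amount < 0:
        errors.append("amount must be a non-negative number")
    return errors


def run_quality_gate(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into (clean, rejected) lists based on check_record."""
    clean, rejected = [], []
    for record in records:
        (rejected if check_record(record) else clean).append(record)
    return clean, rejected
```

In a real Spark job the same idea would typically be expressed as DataFrame filters or Delta Lake constraints, with rejected rows routed to a quarantine table for review.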
Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 3+ years applied experience 
  • Experience with Spark and SQL
  • Expertise in Lakehouse/Delta Lake architecture, system design, application development, testing, and ensuring operational stability
  • Strong programming skills in Python/PySpark
  • Proficient in orchestration using Airflow
  • In-depth knowledge of Big Data and data warehousing concepts
  • Proficient in SQL/SparkSQL
  • Experience with CI/CD processes
  • Thorough understanding of the Software Development Life Cycle (SDLC)
  • Solid understanding of agile methodologies, including DevOps practices, application resiliency, and security measures
  • Proven expertise in software applications and technical processes within a specialized technical domain
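The SQL/SparkSQL proficiency listed above amounts to writing aggregation queries of roughly this shape. As a hedged sketch, the standard-library `sqlite3` module stands in for a Spark SQL engine here, and the `trades` table with its `desk` and `notional` columns is a made-up example, not anything from the posting.

```python
# Illustrative only: sqlite3 stands in for Spark SQL; the table and
# column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("rates", 100.0), ("rates", 50.0), ("credit", 75.0)],
)

# Aggregate notional per desk, largest first -- the shape of query a
# finance data pipeline frequently materializes.
rows = conn.execute(
    "SELECT desk, SUM(notional) AS total "
    "FROM trades GROUP BY desk ORDER BY total DESC"
).fetchall()
# rows == [('rates', 150.0), ('credit', 75.0)]
```

The same statement runs essentially unchanged as SparkSQL via `spark.sql(...)` against a Delta table.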
Preferred qualifications, capabilities, and skills

  • Experience in full-stack development with strong proficiency in Python 
  • Experience with Databricks and the AWS cloud ecosystem
  • Familiarity with Snowflake, Terraform, and LLMs
  • Exposure to AWS cloud services such as AWS Glue, S3, SQS/SNS, and Lambda
  • Familiarity with data observability, data quality, query optimization, and cost optimization
  • AWS certifications such as SAA, Associate Developer, Data Analytics Specialty, or Databricks certification

Required skill profession: Computer Occupations


