United Airlines Corporate Jobs


Job Information

United Airlines Lead Engineer - Data Ops in Chicago, Illinois

We have a wide variety of career opportunities around the world — come find yours

Technology/IT

The United IT team designs, develops, and maintains massively scalable technology solutions that are brought to life with innovative architectures, data analytics, and digital solutions.

Job overview and responsibilities

The United Data Engineering team designs, develops, and maintains massively scalable technology solutions that are brought to life with innovative architectures, data analytics, and digital solutions. The team is building a modern data technology platform in the cloud with advanced DevOps and machine learning capabilities.

The Data Engineering team at United Airlines is on a transformational journey to unlock the full potential of enterprise data, build a dynamic, diverse, and inclusive culture, and develop a modern cloud-based data lake architecture to scale our applications and drive growth using data and machine learning. Our objective is to enable the enterprise to unleash the potential of data through innovation and agile thinking, and to execute an effective data strategy that transforms business processes, rapidly accelerates time to market, and enables insightful decision making. United Airlines is seeking talented people to join the Data Engineering team. The Data Engineering organization is responsible for driving data-driven insights and innovation to support the data needs of commercial and operational projects with a digital focus.

  • Partner with the development team to deploy multiple projects and maintain software using public clouds such as AWS

  • Architect, develop, deploy, and maintain software at scale using Java and Spark

  • Provide expertise to build PySpark-based frameworks for orchestration and pipeline monitoring, and integrate solutions with other applications and platforms outside the framework

  • Support pre-production and production environments, including application deployments, pipeline monitoring and issue resolution, disaster recovery, and incident/problem management

  • Partner with various teams to define and execute data acquisition, transformation, and processing, and make data actionable for operational and analytics initiatives that create sustainable revenue and share growth

  • Design, develop, and implement streaming and near-real-time data pipelines that feed the systems forming the operational backbone of our business

  • Develop and implement innovative solutions leading to automation

  • Use Agile methodologies to manage projects

  • Mentor and train junior engineers

Required

  • Bachelor’s degree in computer science or related STEM field

  • 4-8+ years of IT experience supporting critical enterprise infrastructure environments

  • 5+ years of professional experience managing data pipelines and infrastructure using Spark, Flink, and Airflow

  • 3+ years of experience engineering, architecting, or supporting AWS solutions

  • 3+ years of development experience using Java, Python, or Scala

  • 2+ years of experience with big data technologies such as Spark and EMR, and streaming technologies such as Kafka and Kinesis

  • Computer skills

  • Experience with infrastructure design, implementation, and support

  • Experience with relational database systems such as MS SQL Server, Oracle, and Teradata

  • Experience implementing and supporting AWS-based instances and services (e.g., EC2, S3, EBS, ELB, RDS, IAM, Route 53, CloudFront, ElastiCache, WAF)

  • Experience building and maintaining scalable, auto-scaled environments using automation and configuration management tools (e.g., CloudFormation/Terraform and Ansible) is beneficial but can be gained on the job

  • Scripting ability in one or more of Python, Bash, or Perl

  • Experience using Git for version control is useful

  • Experience working with or supporting containerized environments (ECS/EKS/Kubernetes/Docker)

  • Must be legally authorized to work in the United States for any employer without sponsorship

  • Successful completion of interview required to meet job qualification

  • Reliable, punctual attendance is an essential function of the position

Preferred

  • Master's degree in computer science or related STEM field

  • DevOps experience with AWS or Azure

  • AWS Certified DevOps Engineer - Professional

  • AWS Certified Solutions Architect - Associate or Professional

  • AWS Certified Specialty certification (Security or Advanced Networking)

  • Experience with Data Quality tools including Deequ or Apache Griffin

  • Experience building PySpark based services in a production environment

  • Strong experience with continuous integration and delivery using Agile methodologies

  • Excellent OS knowledge and experience in Linux or Windows, with basic knowledge of the other; MCSE/RHCE or equivalent level of knowledge

Equal Opportunity Employer – Minorities/Women/Veterans/Disabled/LGBT

Division: 47 Technology/IT

Function: Information Technology