Experiences improved. Made possible by you.

Be yourself. Grow your own way. Work on interesting projects.

Airflow Python Developer

Contract Type:

Brick and Mortar

Location:

Hyderabad - TS

Date Published:

02-17-2026

Job ID:

REF40173U

Company Description:

About Sutherland
Artificial Intelligence. Automation. Cloud engineering. Advanced analytics. For business leaders, these are key factors of success. For us, they’re our core expertise.
We work with iconic brands worldwide. We bring them a unique value proposition through market-leading technology and business process excellence.
We’ve created over 200 unique inventions under several patents across AI and other critical technologies. Leveraging our advanced products and platforms, we drive digital transformation, optimize critical business operations, reinvent experiences, and pioneer new solutions, all provided through a seamless “as a service” model.
For each company, we provide new keys to their business, the people they work with, and the customers they serve. We tailor proven and rapid formulas to fit their unique DNA. We bring together human expertise and artificial intelligence to develop digital chemistry. This unlocks new possibilities, transformative outcomes, and enduring relationships.
Sutherland
Unlocking digital performance. Delivering measurable results.

Job Description:

Sutherland is looking for a skilled Python Data Engineer with strong experience in Apache Airflow, data pipeline development, and cloud data platforms (Snowflake / AWS). The role involves building and orchestrating scalable ETL/ELT workflows and automating data processes across multiple systems.

  • Develop and maintain data pipelines using Python, Airflow (DAGs), and AWS/Snowflake components.
  • Build and automate data ingestion, transformation, and scheduling workflows.
  • Develop Airflow DAGs with custom operators, sensors, and hooks, and manage pipeline monitoring.
  • Work on Snowflake-based ELT solutions including data loads, stored procedures, and queries.
  • Write efficient SQL queries and optimize performance for data transformations.
  • Collaborate with cross-functional teams to understand requirements and deliver scalable data solutions.
  • Troubleshoot pipeline failures and ensure high availability of production workflows.
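To give a flavor of the responsibilities above, here is a minimal, framework-free Python sketch of the extract → transform → load flow with a simple data-quality check. In a real Airflow deployment each function would become a task in a DAG (for example via operators or the TaskFlow API), and the source/target systems would be S3 and Snowflake rather than in-memory stand-ins; all names and data here are illustrative, not from the posting.

```python
# Framework-free sketch of an ETL flow with a data-quality check.
# In Airflow, each step would be a DAG task with declared dependencies.

def extract():
    # Stand-in for pulling raw records from a source (e.g. files on S3).
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    # Cast types and apply a simple data-quality check before loading.
    cleaned = [{"id": r["id"], "amount": float(r["amount"])} for r in rows]
    if any(r["amount"] < 0 for r in cleaned):
        raise ValueError("negative amount found")
    return cleaned

def load(rows):
    # Stand-in for writing to a warehouse table (e.g. Snowflake).
    return {r["id"]: r["amount"] for r in rows}

if __name__ == "__main__":
    print(load(transform(extract())))
```

The same extract → transform → load dependency chain is what an Airflow DAG would express declaratively, with retries, scheduling, and monitoring handled by the orchestrator.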

Qualifications:

  • 5–8 years of experience in Python development (advanced scripting and automation).
  • 3+ years of experience with Apache Airflow (DAG design, orchestration, scheduling).
  • Experience with AWS services (S3, Glue, Lambda, Athena) or equivalent cloud technologies.
  • Strong hands-on experience with SQL (advanced querying, optimization).
  • Experience with ETL/ELT data workflows, data validation, data quality checks.
  • Good to have: Snowflake or another cloud data warehouse (Redshift / BigQuery / Databricks).
  • Familiarity with Git, CI/CD, JIRA, or similar tools.
  • Good communication skills and ability to work independently.
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).

Additional Information:

All your information will be kept confidential according to EEO guidelines.
