Date live: Mar. 13, 2026
Business Area: Corporate Digital Banking
Area of Expertise: Technology
Reference Code: JR-0000087928
Contract: Permanent
We are seeking a highly experienced Senior Data Engineer with strong hands-on expertise in building enterprise-scale data pipelines on AWS using Apache Spark (Scala / PySpark). The role requires deep experience with modern lakehouse architectures, Apache Iceberg, dbt, and strong working knowledge of Snowflake and Databricks within an AWS ecosystem.
To be a successful Senior Data Engineer, you should have experience with:
Hands-on experience in data engineering with strong enterprise delivery exposure.
Strong hands-on experience building data pipelines using Apache Spark (Scala and PySpark).
Proven experience building and operating enterprise‑scale data platforms, including IaC / Terraform.
Strong design, validation, and delivery (DVD) skill set:
Design of scalable and maintainable data solutions.
Validation of data quality, reconciliation, and correctness.
Delivery of production-ready, operationally stable pipelines.
Experience with Airflow or similar orchestration tools.
Strong experience working in regulated / large enterprise environments.
Strong hands‑on experience with AWS services, including:
S3, AWS Glue (Jobs & Catalog), Athena, Lakehouse architectures, Step Functions / Lambda.
Hands-on experience with Apache Iceberg.
Experience integrating Snowflake and/or Databricks with AWS data lakes and metadata catalogs.
Strong SQL and data modelling skills.
Additional relevant skills given below are highly valued:
Ab Initio ETL experience.
Additional orchestration experience beyond Airflow.
Exposure to complex enterprise data governance and compliance patterns.
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
The location of the role is Pune.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
Vice President Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.