Date live: Dec. 15, 2025
Business Area: Markets Post Trade
Area of Expertise: Technology
Reference Code: JR-0000073322
Contract: Permanent
Join us as a Data Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
You may be assessed on the key critical skills relevant for success in this role, such as the experience and skills needed to meet the business requirements, as well as job-specific technical skillsets.
To be successful as a Data Engineer, you should have experience with:
Basic/Essential Qualifications:
Must be a graduate with a bachelor's degree.
Solid experience in data engineering, ETL, and building and maintaining pipelines for structured, semi-structured and unstructured data from multiple upstream systems.
Work with various formats and protocols (CSV, JSON, XML, Avro, Parquet, APIs, streaming feeds and messaging queues).
Develop scalable ETL/ELT workflows using AWS and big data frameworks (a minimal sketch follows this list).
Strong experience in AWS (Redshift, Glue, Lambda, Step Functions, CloudFormation templates, CloudWatch, API Gateway).
Excellent programming skills in Python or PySpark.
Experience with ETL orchestration tools, workflow schedulers and CI/CD pipelines.
Excellent knowledge of Data Storage and Warehousing concepts.
Model and maintain datasets within warehouses and data lakes (S3, Redshift, Hive/Glue Catalog).
Experience with database systems (relational: Oracle, SQL Server, PostgreSQL, MySQL; columnar: Redshift, Snowflake; NoSQL: MongoDB, Cassandra).
Advanced SQL skills (DDL, DML, performance tuning) and scripting experience (PL/SQL, T-SQL, Python or Shell).
Knowledge of data warehousing concepts (Inmon/Kimball) and ETL tools (e.g., Informatica).
Cloud platform experience, ideally AWS (S3, Redshift, Glue) and Data Lake implementation.
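To illustrate the ETL/ELT work referenced above, here is a minimal PySpark sketch that reads a CSV feed from S3, applies a light cleansing step, and writes partitioned Parquet back out. All bucket names, paths and column names (trade_id, trade_timestamp) are hypothetical illustrations, not details from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Read a structured upstream feed (CSV) from S3; schema inference kept simple.
trades = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-upstream-bucket/trades/")  # hypothetical path
)

# Light cleansing: de-duplicate and derive a partition column.
cleaned = (
    trades.dropDuplicates(["trade_id"])                      # hypothetical column
    .withColumn("trade_date", F.to_date("trade_timestamp"))  # hypothetical column
)

# Write partitioned Parquet for downstream warehouse and lake consumption.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("trade_date")
    .parquet("s3://example-curated-bucket/trades/")  # hypothetical path
)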
Some other highly valued skills may include:
Experience with big data ecosystems (Hadoop, Databricks, Snowflake, etc.).
Knowledge of Kafka and real-time flows with MSK (see the streaming sketch after this list).
Knowledge of Trino/Presto, Delta/Iceberg/Hudi.
Experience with Data Quality frameworks and metadata management.
Exposure to post-trade settlement, clearing, reconciliations or financial markets is preferred.
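For the Kafka/MSK point above, a minimal Spark Structured Streaming sketch might look like the following: it consumes JSON events from a Kafka topic and appends them to a lake path. The broker address, topic name, event schema and paths are hypothetical, and a real job would also need the spark-sql-kafka connector on the classpath, plus watermarking and data-quality checks.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Hypothetical schema for a settlement-style event payload.
event_schema = StructType([
    StructField("trade_id", StringType()),
    StructField("status", StringType()),
    StructField("amount", DoubleType()),
])

# Subscribe to a Kafka (MSK) topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1.example:9092")  # hypothetical broker
    .option("subscribe", "settlement-events")                    # hypothetical topic
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse the JSON.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append parsed events to a lake path; the checkpoint enables fault-tolerant restarts.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-lake/settlement_events/")   # hypothetical path
    .option("checkpointLocation", "s3://example-lake/_chk/")  # hypothetical path
    .outputMode("append")
    .start()
)
query.awaitTermination()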
This role is based out of Pune.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
Analyst Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.