Date live: Sep. 15, 2025
Business Area: Customer Digital and Data
Area of Expertise: Technology
Reference Code: JR-0000061147
Contract: Permanent
Join us as a Data Engineer - PySpark and AWS, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
To be successful as a Data Engineer - PySpark and AWS, you should have experience with:
- Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs and Spark SQL (see the illustrative sketch after this list).
- Hands-on experience developing, testing and maintaining applications on the AWS cloud.
- A strong grasp of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena).
- Designing and implementing scalable, efficient data transformation and storage solutions using Snowflake.
- Experience ingesting data into Snowflake from storage formats such as Parquet, Iceberg, JSON and CSV.
- Experience using dbt (Data Build Tool) with Snowflake for ELT pipeline development.
- Experience writing advanced SQL and PL/SQL programs.
- Hands-on experience building reusable components using Snowflake and AWS tools and technologies.
- Experience of at least two major project implementations.
- Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
- Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage.
- Knowledge of the Ab Initio ETL tool is a plus.
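For illustration only (not part of the formal requirements), here is a minimal sketch of the kind of PySpark work this role describes: reading Parquet data from S3, transforming it with the DataFrame API, and querying it with Spark SQL. The bucket, paths and column names are hypothetical, and the snippet assumes an AWS-configured Spark environment such as EMR or Glue.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative sketch only: bucket, paths and columns are hypothetical.
spark = SparkSession.builder.appName("orders-daily-totals").getOrCreate()

# Read raw Parquet data from S3 (assumes an AWS-configured environment,
# e.g. EMR or Glue, where the s3:// scheme is available).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# DataFrame API: filter completed orders, derive a date column,
# and aggregate totals per day.
daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Spark SQL over the same result via a temporary view.
daily.createOrReplaceTempView("daily_totals")
spark.sql("SELECT * FROM daily_totals ORDER BY order_date").show()
```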
Some other highly valued skills may include:
- Ability to engage with stakeholders, elicit requirements and user stories, and translate them into ETL components.
- Ability to understand the infrastructure setup and provide solutions, either individually or working with teams.
- Good knowledge of data mart and data warehousing concepts.
- Strong analytical and interpersonal skills.
- Experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, and building a data movement strategy.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
The role is based out of Chennai.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
Analyst Expectations
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.