
Keywords: Python, AWS, PySpark. Location: Bangalore, Karnataka

Page: 6

IN_Senior Associate_Data Engineer_Data and Analytics_Advisory_Bangalore

big data solutions. Implement data processing pipelines using PySpark and Hadoop. Develop and optimize SQL queries... to ensure optimal performance. Mandatory skill sets: proficiency in Python; strong programming skills in Python for data...

Company: PwC
Posted Date: 05 Oct 2025

Senior Data Engineer

to process large-scale vehicle fleet data using PySpark on AWS/Alibaba cloud infrastructure. Collect, clean, and transform raw... test as well as simulation on real data (software in the loop). Skills: must have Python knowledge, Spark/PySpark, Linux...

Company: Luxoft
Posted Date: 05 Oct 2025

Senior Developer

dashboards for business insights and reporting. Create and optimize data workflows using Python and PySpark. Serve... in Python, PySpark, and data pipeline development. Good knowledge of Tableau for data visualization and reporting...

Posted Date: 03 Oct 2025

Technical Project Manager

with stakeholders. Experience in AWS services and handling Python/PySpark/Glue coding. Certifications: PgMP (Program Management... project, taking complete ownership. Good understanding of ETL and Data Warehousing concepts along with any cloud (AWS/Azure...

Company: NR Consulting
Posted Date: 03 Oct 2025

Data Engineer

language like Python and any other programming language. Candidate must have hands-on experience in AWS Databricks. Good... engineer (developer), having strong software development experience of 5 to 10 years on AWS Databricks. Qualifications...

Company: Varite
Posted Date: 03 Oct 2025

Data Engineer (Databricks)

and implement highly scalable and efficient ETL/ELT processes using Databricks notebooks (Python/Spark or SQL) and other Databricks...: Extensive experience with Spark (PySpark, Spark SQL) for large-scale data processing. Deep understanding and practical...

Company: Varite
Posted Date: 03 Oct 2025

ETL Developer

Qualifications: 5–9 years of experience. Mandatory skills: PySpark, Python, ADF/Spark/SQL/Synapse/Databricks at a high level, DevOps (good... or DevOps roles. Strong ability in programming languages such as PySpark, Python, C#, Scala, or Power BI. Experience with data...

Company: Varite
Posted Date: 03 Oct 2025

Data Engineer

PySpark, AWS, SQL: DE... Job Title: Data Engineer. Experience: 7+ years. Location: Chennai (preferred)/Bangalore. Must-have skills: Python...

Company: NR Consulting
Posted Date: 03 Oct 2025

Palantir Foundry Data Engineer

Development: Build scalable, fault-tolerant ETL/ELT pipelines using technologies such as PySpark, SQL, and Palantir’s proprietary... Required Technical Proficiency: Strong coding skills in Python, Java, or Scala, and deep experience with SQL. Familiarity...

Posted Date: 03 Oct 2025

Big Data Engineer

with expertise in Python, PySpark, Hive, Hadoop HDFS, Oozie, and YARN to join our data engineering team. The ideal candidate... pipelines using PySpark and Python. Work with Hadoop HDFS to manage and process large volumes of structured and unstructured...

Company: Varite
Posted Date: 03 Oct 2025

Data Engineer

Databricks Developer to join our team. The ideal candidate will have expertise in Python, PySpark, SQL, and cloud platform... Hands-on experience with Python, PySpark, Scala, SQL. Experience with Spark performance tuning and optimization. Knowledge of Delta Lake...

Company: Varite
Posted Date: 02 Oct 2025

I&I Data Engineer

PySpark/Python with AWS...

Company: NR Consulting
Posted Date: 02 Oct 2025

Databricks

Databricks, Spark, Python/SQL. Requirements/Responsibilities: Design, develop, and optimize data workflows and notebooks... and efficient data processing workflows using Spark (PySpark or Spark SQL) by following coding standards and best practices...

Company: Talent Worx
Posted Date: 02 Oct 2025
Salary: Rs.400000 - 1800000 per year

Data Transformation Architect

Big Data (Hadoop), Python, and PySpark. Possess excellent verbal & written communication skills. Expertise in application, data... machine learning programming languages including Python, PySpark, or Scala. 5+ years of experience with big-data technologies...

Company: Talent Worx
Posted Date: 02 Oct 2025

Lead I - Data Engineering

tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently... of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF...

Company: UST
Posted Date: 02 Oct 2025

Technical Services Manager

data tools such as PySpark. Experience with relational SQL databases. Experience with AWS cloud services used often... and answering questions related to the product. Experience with scripting languages such as Python and Scala. Experience with big...

Company: Autodesk
Posted Date: 26 Sep 2025

Sr Systems Developer(Data Engineer)

using AWS analytical services. Proficiency in Python & PySpark. Ensure database security and compliance with HIPAA, GDPR... with collaborators. Knowledge, Skills, and Abilities: SQL Server to AWS migration using analytical services. Client-focused...

Posted Date: 26 Sep 2025

Lead II - Software Engineering

pipelines for batch and streaming data ingestion. Proficiency in Python and PySpark for distributed computing; knowledge..., and DevOps workflows. Knowledge of AWS cloud services (EC2, S3, EBS) and security best practices (e.g., Apache Ranger). Basic...

Company: UST
Posted Date: 26 Sep 2025

Lead Database Engineer

sources on the Cloud (AWS/GCP), using well-defined ETL tools like PySpark, Data Flow. Data acquisition and clean-up... understanding of programming languages, preferably Python, for custom scripts. Understanding of NoSQL and in-memory databases...

Posted Date: 23 Sep 2025

Database Architect

transformation, and loading of data from a wide variety of data sources on the Cloud (AWS/GCP), using well-defined ETL tools like PySpark, Data Flow. Data acquisition and clean-up from multiple sources across the organization. Design custom tools...

Posted Date: 23 Sep 2025