platforms. The ideal candidate will have deep experience in Apache Spark, Databricks, PySpark, and cloud platforms such as Azure... Responsibilities: Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks. Write efficient and production-ready...
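The pipeline responsibilities above follow the familiar extract-transform-load shape. A minimal sketch in plain Python, with no Spark dependency — the function names `extract`, `transform`, and `load` are illustrative, not from the posting; a real Databricks job would operate on Spark DataFrames at each stage:

```python
# Illustrative ETL skeleton; each stage here works on plain lists,
# standing in for the Spark DataFrames a Databricks pipeline would use.

def extract(source):
    """Read raw records (a list standing in for a table or file)."""
    return list(source)

def transform(records):
    """Apply business logic: drop invalid rows, derive a field."""
    return [
        {**r, "total": r["qty"] * r["price"]}
        for r in records
        if r["qty"] > 0
    ]

def load(records, sink):
    """Write transformed records to the target store (a list here)."""
    sink.extend(records)
    return len(records)

raw = [{"qty": 2, "price": 5.0}, {"qty": 0, "price": 9.9}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The zero-quantity row is filtered out, so one record lands in `warehouse` with a derived `total` of 10.0.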
and/or experience in many of the following: batch or stream processing systems such as Apache Spark, Flink, Akka, Storm. Message... brokers such as Kafka, AWS SQS, AWS SNS, Apache ActiveMQ, Kinesis. AWS environment such as Lambda functions, SQS, Glue, Step...
systems, including Apache Spark and AWS Glue. Familiarity with scheduling tools like Apache Airflow and Prefect. Skilled...
transformation tools (dbt, Apache Spark, Azure Data Factory). Strong analytical skills and solid knowledge of computer science... using tools like Kafka, Spark, and cloud-native services. Establish best practices in data modelling, ETL/ELT, data...
pipelines and ETL workflows using big data platforms such as Cloudera. Work with Apache Spark for data processing.... Requirements Essential Skills: Proficiency in Scala, Java, and Python. Experience with Apache Spark and Cloudera ecosystem...
7+ years of experience designing and building data pipelines using Apache Spark, Databricks, or equivalent big data... frameworks. Hands-on expertise with streaming and messaging systems such as Apache Kafka (publish-subscribe architecture...
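Kafka's publish-subscribe model, mentioned above, decouples producers from consumers through named topics. A toy in-memory sketch of the pattern — the `ToyBroker` class and its methods are illustrative inventions, not Kafka's client API; real Kafka clients talk to a broker cluster over the network:

```python
from collections import defaultdict

class ToyBroker:
    """Minimal in-memory stand-in for a pub/sub broker like Kafka."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> consumer callbacks

    def subscribe(self, topic, callback):
        """Register a consumer callback for a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

broker = ToyBroker()
received = []
broker.subscribe("orders", received.append)
broker.subscribe("orders", lambda m: received.append(m.upper()))
broker.publish("orders", "order-42")
```

The key property the sketch shows: the publisher never references its consumers, so subscribers can be added or removed without changing producer code.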
or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines... using GCP services, Apache Beam, and related big data technologies. Familiarity with cloud-based infrastructure...
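Apache Beam, named above, unifies batch and streaming by applying transforms over windowed collections. A hedged pure-Python sketch of fixed-window aggregation — the `fixed_windows` helper is an illustrative simplification, not Beam's API; it loosely mirrors what `FixedWindows` plus a sum combiner would do:

```python
from collections import defaultdict

def fixed_windows(events, window_size):
    """Group (timestamp, value) events into fixed-size time windows and
    sum each window -- a simplified analogue of Beam's FixedWindows
    followed by a per-window sum combine."""
    windows = defaultdict(int)
    for timestamp, value in events:
        window_start = (timestamp // window_size) * window_size
        windows[window_start] += value
    return dict(windows)

# Events at t=0 and t=3 fall in window [0, 5); t=5 in [5, 10); t=11 in [10, 15)
events = [(0, 1), (3, 2), (5, 4), (11, 8)]
result = fixed_windows(events, window_size=5)
```

`result` is `{0: 3, 5: 4, 10: 8}`: each key is a window's start time, each value the sum of events inside it.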
., Apache Airflow). Big Data tools (e.g., Hadoop, Spark). Effective and fluent communication skills – verbal and written. Ability...
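Orchestration tools like Apache Airflow, listed above, model a workflow as a DAG and run each task only after its upstream dependencies complete. A toy sketch of that scheduling idea using the standard library's topological sort — the `run_dag` helper and task names are illustrative, not Airflow's API:

```python
from graphlib import TopologicalSorter

def run_dag(dependencies, tasks):
    """Execute tasks in an order that respects the dependency DAG,
    as an orchestrator would (minus retries, sensors, parallelism)."""
    order = TopologicalSorter(dependencies).static_order()
    return [tasks[name]() for name in order]

# extract must finish before transform, which must finish before load
deps = {"transform": {"extract"}, "load": {"transform"}}
tasks = {
    "extract": lambda: "extract",
    "transform": lambda: "transform",
    "load": lambda: "load",
}
order = run_dag(deps, tasks)
```

With a linear dependency chain, `order` comes out as `["extract", "transform", "load"]`; Airflow applies the same ordering principle per scheduled run.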