Job Position: PySpark Developer Job Location: Dallas, TX (Onsite) Job Duration: Full-time... Job Description: Must Have Technical/Functional Skills: Associates with 7-10 years' experience in data/ETL testing with hands-on PySpark and exposure...
Role: Technology Lead - Engineering | Cloud Integration | Azure Data Factory (ADF) -- Databricks Developer Location...: Remote Job Type: Long-Term Contract Job Summary: Must-Have Skills: Databricks, Data Warehouse, ETL, SQL, PySpark...
Job Type: FTE Location: Remote (can sit anywhere in the US) Overview The Azure Full Stack Developer will lead the... Lake, Synapse Spark SQL, PySpark, Azure Data Explorer, Logic Apps, Key Vault, Semi-structured data processing, Integration...
Job Title: Big Data Developer Location: Strongsville, OH / Pittsburgh, PA / Dallas, TX (Onsite) Job Type: Full-time... Job Description: Must Have Technical/Functional Skills: Big Data: Hadoop & ecosystem, Scala/Python, PySpark, Oracle PL/SQL, CI/CD, excellent communication...
Zone IT Solutions is seeking a Teradata and Hadoop Developer to enhance our data solutions team. In this role... both Teradata and Hadoop to drive business insights. Requirements: 10+ years' experience as an Analyst or Developer. 7+ years...
such as Python and PySpark for executing data engineering tasks. Exceptional analytical and problem-solving skills, particularly..., from source to destination, including data cleansing, transformation, and enrichment. Proficiency in PySpark for engineering data...
, networking, developer tooling, collaboration and more. Innovate with UI/UX designers, data scientists, cloud engineers... Proficiency with Apache Spark or PySpark for large-scale data processing Exposure to financial markets, traded products...
and PySpark. Secondary Skills: MLOps and LLMOps. Certifications Must Have: Microsoft Certified: Azure Developer Associate...
requirements-gathering sessions, architect scalable technical solutions, and contribute as a developer to build and deliver working.../Fabric Data Factory, and Azure SQL. Strong hands-on background with Python, SQL, and PySpark for data engineering...