
Data Engineer - Consultant (PySpark, ADF, SQL): 6-month contract, 2 to 4 years of experience required
Location: Dubai, United Arab Emirates
Employer: Virtua Advanced Solution



Job description

It is a 6-month contract role, extendable further at the client's discretion.

Minimum 2 years of experience is required.

Budget: 10k to 12k AED, plus visa, medical insurance, and work permit.

Please let me know if you would be interested in the role or have any friends looking for a job.

What You'll Do:

  • Design, develop, and maintain data pipelines for ingestion, transformation, and loading of data into the data warehouse.

  • Design, develop, and maintain data pipelines using PySpark and Azure Data Factory (ADF); a brief illustrative sketch follows this list.

  • Implement data governance frameworks and ensure data quality, security, and compliance with industry standards and regulations.

  • Develop complex SQL queries and manage relational databases to ensure data accuracy and performance.

  • Establish and maintain data lineage tracking within the data fabric to ensure transparency and traceability of data flows.

  • Implement ETL processes to ensure the integrity and quality of data.

  • Optimize data pipelines for performance, scalability, and reliability.

  • Develop data transformation processes and algorithms to standardize, cleanse, and enrich data for analysis.

    Apply data quality checks and validation rules to ensure the accuracy and reliability of data.

  • Mentor junior team members, review code, and drive best practices in data engineering methodologies.

  • Collaborate with cross-functional teams, including data scientists, business analysts, and software engineers, to understand data requirements and deliver solutions that meet business objectives.

    Work closely with stakeholders to prioritize and execute data initiatives.

  • Maintain comprehensive documentation of data infrastructure designs, ETL processes, and data lineage.

    Ensure compliance with data governance policies, security standards, and regulatory requirements.
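
For context on the PySpark and ADF pipeline work described above, here is a minimal, illustrative PySpark sketch of an ingest, cleanse, validate, and load step. All paths, column names, and quality rules are assumptions made for the example, not details from the posting; in practice the orchestration would typically sit in ADF or Databricks.

# Minimal PySpark sketch (illustrative only): ingest raw data, standardize it,
# apply a simple data quality rule, and load valid rows to a curated zone.
# Paths, columns, and rules below are assumptions, not taken from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Ingest: read a raw CSV drop (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: trim keys, cast types, drop duplicate order records.
clean = (
    raw.withColumn("customer_id", F.trim(F.col("customer_id")))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Data quality check: keep rows with a key and a non-negative amount,
# quarantine everything else for review.
valid = clean.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
rejected = clean.subtract(valid)

# Load: curated zone for valid rows, quarantine area for rejects.
valid.write.mode("append").parquet("/mnt/curated/orders/")
rejected.write.mode("append").parquet("/mnt/quarantine/orders/")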

Qualifications: What You'll Bring:

  • Strong proficiency in SQL and at least one programming language (e.g., Python) for data manipulation and scripting; a short example follows this list.

  • Strong experience with PySpark, ADF, Databricks, and SQL.

  • Preferable experience with MS Fabric.

  • Proficiency in data warehousing concepts and methodologies.

  • Strong knowledge of Azure Synapse and Azure Databricks.

  • Hands-on experience with data warehouse platforms (e.g., Snowflake, Redshift, BigQuery) and ETL tools (e.g., Informatica, Talend, Apache Spark).

  • Deep understanding of data modeling principles data integration techniques and data governance best practices.

  • Preferable experience with Power BI or other data visualization tools to develop dashboards and reports.
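
As a small illustration of the SQL proficiency called for above, the sketch below runs a window-function query through Spark SQL, as one might in Databricks or Synapse. The view name, columns, and path are assumptions for the example only.

# Illustrative only: deduplicate customer records with a SQL window function,
# keeping the most recent row per customer. Names and paths are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_dedup_sketch").getOrCreate()

# Register a curated dataset as a temporary view so it can be queried with SQL.
spark.read.parquet("/mnt/curated/customers/").createOrReplaceTempView("customers_raw")

latest = spark.sql("""
    SELECT customer_id, email, updated_at
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY updated_at DESC) AS rn
        FROM customers_raw
    ) ranked
    WHERE rn = 1
""")

latest.show(5)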

Remote Work:

Yes

Employment Type:

Full-time



Required Skill Profession

Other General


