Data Engineer
· Design and maintain scalable data pipelines aligned with lakehouse standards.
· Implement data integration from multiple systems into the lakehouse environment.
· Handle ingestion of structured/unstructured, streaming, and batch data efficiently.
· Apply transformations using Spark (preferably PySpark) and optimise workloads for performance and cost (see the sketch after this list).
· Ensure data quality, governance, and consistency across datasets.
· Collaborate with cross-functional teams to support data analytics and reporting requirements.
· Contribute to improving architecture and workflow reliability.
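As a flavour of the ingestion and transformation work above, the following is a minimal PySpark sketch rather than a prescribed implementation: the raw landing path and the silver.orders Delta table are hypothetical, and writing in Delta format assumes a Spark runtime with Delta Lake enabled (for example, Databricks).

from pyspark.sql import SparkSession, functions as F

# Hypothetical application name, paths, and table names for illustration only.
spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Batch-ingest raw Parquet files landed by an upstream system.
raw = spark.read.parquet("s3://raw-zone/orders/")

# Basic transformations: type casting, deduplication, and a derived partition column.
cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to a Delta table partitioned by date for efficient downstream reads.
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("order_date")
        .saveAsTable("silver.orders"))

Partitioning by order_date is one common layout choice; the right key depends on query patterns and data volume.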
Requirements
· Strong analytical skills with hands-on experience in SQL and Python (PySpark preferred).
· Solid understanding of data lake, Delta Lake, and lakehouse concepts (illustrated in the sketch after this list).
· Familiarity with big data technologies such as the Parquet storage format and Unity Catalog.
· Experience with Spark and distributed data processing frameworks.
· Understanding of cloud data platforms and cluster/workload management.
· Experience using Databricks is considered an asset.
· Excellent verbal and written English communication skills.
· Previous experience collaborating with data engineering or analytics teams.
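To make the Delta Lake and data-quality points concrete, here is a minimal hedged sketch of a quality gate on a Delta table. The table silver.orders and its columns order_id and amount are hypothetical, and the CHECK constraint assumes a Delta Lake-enabled runtime such as Databricks.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read a published Delta table (hypothetical name).
df = spark.table("silver.orders")

# Simple quality checks: no null business keys, no negative amounts.
null_keys = df.filter(F.col("order_id").isNull()).count()
bad_amounts = df.filter(F.col("amount") < 0).count()

if null_keys or bad_amounts:
    raise ValueError(
        f"Quality check failed: {null_keys} null keys, {bad_amounts} negative amounts"
    )

# Enforce the amount rule at the table level with a Delta CHECK constraint,
# so future writes that violate it are rejected.
spark.sql(
    "ALTER TABLE silver.orders "
    "ADD CONSTRAINT non_negative_amount CHECK (amount >= 0)"
)

On Databricks, Delta Live Tables expectations offer a more declarative route to the same kind of rule.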
You will also need to:
· Demonstrate enthusiasm for working with large-scale data systems.
· Be proactive and adaptable in a collaborative environment.
· Think creatively and contribute to continuous improvement.
About the job
Posted: 30 Nov, 2025
Function: Information Technology
Employment Type: Full time
Work Model: Remote
Country: Egypt
City: Cairo
Career Level: Other
No. of Openings: 1

Role Details & Requirements
Years of Experience: 4
Minimum Education: Bachelor
Language Skills: English
Key Skills: Python, SQL, PySpark, data lake, Delta Lake, lakehouse

Compensation & Benefits
Salary Range: -
Currency: EGP
Additional Benefits: -

Recruitment Service Details
Urgency Level: ASAP
Accepted Notice: 1 Month
Required Service: CVs and Screening Feedback
Recruitment Budget: -
Payment Terms: -