A well-known and reputable company based in Dubai, United Arab Emirates is looking for an experienced, qualified, and creative candidate with relevant working experience for the position of “Data Engineer”.
|Experience||5 – 7 years of Relevant Experience Required|
|Monthly Salary||6,500 AED – 8,000 AED|
|Employment Type||Full Time, Permanent|
|Company Size||50-100 Employees|
Requirements
- This position requires a Bachelor’s Degree in Computer Science or a related technical field and 5+ years of relevant employment experience.
- 5+ years of work experience with ETL, Data Modeling, and Data Architecture.
- Expert-level skills in writing and optimizing SQL.
- Experience with Big Data technologies such as Hadoop/Hive/Spark.
- Solid Linux skills.
- Experience operating very large data warehouses or data lakes.
- At least 3 years of experience in Python.
Roles & Responsibilities
- Expertise in ETL optimization and in designing, coding, and tuning big data processes using Apache Spark or similar technologies.
- Experience building data pipelines and applications to stream and process datasets at low latency.
- Efficiency in handling data: tracking data lineage, ensuring data quality, and improving the discoverability of data.
- Sound knowledge of distributed systems and data architecture (e.g., the Lambda architecture); able to design and implement batch and stream data processing pipelines and to optimize the distribution, partitioning, and MPP of high-level data structures.
- Knowledge of engineering and operational excellence using standard methodologies.
- In-depth hands-on experience with the Amazon Web Services public cloud and its various services, such as EC2, EBS, S3, SNS, SES, RDS, Redshift, SFTP, AWS Directory Service, Kinesis, etc.
- Hands-on experience designing scalable, fault-tolerant, deployable architectures and migrating applications to AWS.
- Engage in the end-to-end Data Ingestion → Data Transformation → Data Persistence → Performance Optimization process.
- Experience with Cosmos DB/MongoDB, Data Lake, Azure Stream Analytics, and Bot Framework.
- Database integration, administration, reporting & governance.
- Candidates are expected to have a solid understanding of AI and machine learning concepts.
- Responsible for data integrity & governance.