Role: Senior Technology Engineer (Big Data – Hadoop)
Location: Dubai, United Arab Emirates
Engagement: Contractual
Duration: 12 Months (Extendable, Long-term with Marc Ellis)
Job Summary:
The Senior Technology Engineer (Hadoop) is responsible for architecting, implementing, and managing
scalable big data platforms with a primary focus on Hadoop-based ecosystems. This role also involves
supporting DevOps practices, CI/CD automation, container orchestration, and infrastructure management to
ensure the efficient and secure operation of data-intensive applications.
Key Responsibilities:
• DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools such as Jenkins and GitOps practices
to automate and streamline the software development lifecycle.
• Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes
and OpenShift, ensuring high availability and scalability.
• Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like
Terraform or Ansible.
• Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop,
Spark, and Kafka.
• Distributed Systems: Design and manage distributed data architectures to ensure efficient data
processing and storage.
• Collaboration: Work closely with development, operations, and data teams to understand requirements
and deliver robust solutions.
• Monitoring and Optimization: Implement monitoring solutions and optimize system performance,
reliability, and scalability.
• Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and
regulatory requirements.
Education & Experience:
• Bachelor's or postgraduate degree in Computer Science or a related field.
• 8+ years of experience in Information Technology, including 3+ years in data engineering, architecture,
and the definition and implementation of technology solutions.
• Extensive experience in the banking and financial services domain.
Technical Skills:
• Up-to-date knowledge of Cloudera Data Platform (CDP) versions, including releases 7.1.7 through 7.1.9.
• In-depth knowledge of Cloudera Manager for deploying, configuring, and monitoring CDP services.
• Strong understanding of security mechanisms such as Kerberos, LDAP/AD integration, and Transport
Layer Security (TLS).
• Ability to collaborate with supporting teams (database, network, security, and system teams), conduct root
cause analysis of production issues, and provide corrective actions.
• Strong command of the Linux CLI, as it is the foundation for managing CDP environments.
• Skilled in automating repetitive tasks using scripting languages like Bash or Python.
• Ability to set up and manage monitoring solutions for CDP clusters to increase observability.
• Expertise in Data Architecture, Data Strategy, and Roadmap for large and complex organizations and
systems, with experience implementing large-scale end-to-end Data Management & Analytics solutions.
• Experience in transforming traditional Data Warehousing approaches to Big Data–based approaches,
with a proven track record in managing risks and data security.
• Expertise in DW dimensional modeling techniques, including Star and Snowflake schemas, modeling
slowly changing dimensions and role-playing dimensions, dimensional hierarchies, and data
classification.
• Experience with cloud-native principles, designs, and deployments.
• Extensive experience working in and enhancing Continuous Integration (CI) and Continuous Deployment
(CD) environments.
• Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, and
Data Archival.
• Ability to define workload migration strategies using appropriate tools.
• Proven ability to drive delivery in a matrixed environment by working with various internal IT partners.
• Demonstrated ability to work in a fast-paced and changing environment with short deadlines,
interruptions, and multiple simultaneous tasks/projects.
• Ability to work independently with strong skills in planning, strategy, estimation, and scheduling.
• Strong problem-solving, influencing, communication, and presentation skills; self-starter.
• Experience with data processing frameworks and platforms (e.g., Informatica, Hadoop, Presto, Tez,
Hive, and Spark).
• Hands-on experience with related/complementary open-source software platforms and languages (e.g.,
Java, Linux, Python, Git, Jenkins).
• Exposure to BI tools and reporting software (e.g., Microsoft Power BI and Tableau).
Functional & Soft Skills:
• Strong business acumen with the ability to align initiatives with organizational goals.
• Proven leadership capabilities in managing teams and driving organizational success.
• Experience in coaching and mentoring team members to foster growth and development.
• Excellent analytical skills with a data-driven decision-making approach.
• Demonstrated ability to think critically and apply systems thinking.
• Skilled in negotiation and influencing across multiple stakeholders.
• Highly disciplined and organized, with strong time management skills.
• Flexible and adaptable to shifting priorities and fast-paced environments.
• Visionary mindset with the capacity to anticipate trends and drive innovation.
• Results-oriented with a strong focus on performance and continuous improvement.