
Big Data Solution Architect


Abu Dhabi, UAE


About the job

Saal.ai is an innovative AI company based in Abu Dhabi, UAE. We build advanced cognitive solutions, products, and platforms for businesses that solve challenging real-life problems for customers.

Leveraging the power of AI and the myriad options arising from human-machine interaction, Saal.ai strives to unlock exponential growth opportunities so that people can live meaningful and compassionate lives.

The uniqueness of Saal’s framework of solutions lies in its flexibility: it is continuously optimized to integrate seamlessly with any business.


Key Responsibilities:


  • Assist the team with business analysis, requirements gathering, data analysis, data modeling, project management, and project estimation.
  • Define, design, and develop services and solutions for large-scale data ingestion, storage, and management, covering RDBMS, NoSQL databases, log files, and events.
  • Work with third-party and internal service providers to support a variety of integrations.
  • Define, design, and run robust data pipelines and batch jobs in a production environment.
  • Architect highly scalable, highly concurrent, low-latency systems.
  • Work with product teams on a range of tools and services, improving products to meet user needs.
  • Participate in sprint planning, working with developers and project teams to ensure projects are deployable and monitorable from the outside.
  • As part of the team, you may be expected to participate in second-line in-house support and out-of-hours support.
  • Proactively advise on best practices.
  • Assist in the budgeting process.
  • Excellent written and spoken communication skills: the ability to communicate with impact, ensuring complex information is articulated meaningfully to wide and varied audiences, including senior executives, business stakeholders, architecture governance bodies, and IT delivery teams. Ability to work globally and across cultures.


Education, Experience and Required Skills


  • Degree in Computer Science, Software Engineering, or a related field preferred
  • Minimum 10 years of relevant experience in the field
  • Experience with big data environments, including advising the Analytics team on best practices and new technologies
  • Proven track record of architecting and implementing large-scale data programs end-to-end
  • Expert in the design and implementation of Big Data solutions (Spark, the Hadoop ecosystem, NiFi, Kafka, NoSQL and document databases) and data architecture patterns (data warehouse, data lake, streaming)
  • Expert in the full range of data engineering approaches, covering both theoretical best practices and their technical application
  • Knowledge of AI/ML
  • Understanding of cluster administration and troubleshooting
  • Experience as a Solution Architect for large-scale analytics, insight, decision-making, and reporting solutions based on Big Data technology
  • Experience configuring and managing Linux servers
  • In-depth knowledge of the Hadoop technology ecosystem: HDFS, NiFi, Spark, Hive, Impala, HBase, Kafka, Flume, Sqoop, Oozie, Avro, Parquet
  • Experience debugging complex multi-server services
  • In-depth knowledge of and experience with IaaS/PaaS solutions (e.g., AWS infrastructure hosting and managed services)
  • Familiarity with network protocols: TCP/IP, HTTP, SSL/TLS, etc.
  • Understanding of continuous integration and delivery
  • Experience working in an agile environment
  • Knowledge of version control systems such as Git or Subversion



To apply for this job please visit www.linkedin.com.