About Your Job:
In this role, you will create reliable architectures for building highly scalable data pipelines that collect large volumes of data from different sources and transform it into a usable format for analysis.
You will design, implement, and maintain a full suite of real-time and batch jobs that fuel our cutting-edge data analytics platform, providing real-time intelligence to our businesses.
Key Responsibilities:
Be a full-stack data engineer, building highly scalable, high-quality data products.
Create and maintain an optimal data pipeline architecture for the ingestion, storage, processing, and transformation of data used to build analytics data products.
Ensure each data product is complete and cleansed, and that rules have been defined for handling outliers.
Ensure Audit, Balance & Control (ABC) processes are implemented for consistent, complete, and comprehensive data quality and data integrity management.
Ensure compliance with the data strategy and governance framework, and integrate the organization’s data management processes into data product development.
Take accountability for implementing modern security controls for sensitive data products.
Write automated unit tests for data products and integrate them into the CI/CD pipeline.
Develop data quality routines.
Provide 24/7 support for the data products developed.
Identify and manage reference data.
Build scalable, resilient, and sustainable solutions that address business requirements.
Tackle challenges and solve complex problems on a daily basis.
Be part of an extraordinary story
Your skills. Your imagination. Your ambition. Here, there are no boundaries to your potential and the impact you can make.
You’ll find infinite opportunities to grow and work on the biggest, most rewarding challenges that will build your skills and experience.
You have the chance to be a part of our future, and build the life you want while being part of an international community.
Our best is here and still to come. To us, impossible is only a challenge. Join us as we dare to achieve what’s never been done before.
Together, everything is possible
About You:
You should have a Bachelor’s degree or equivalent, ideally in engineering, computer applications, commerce, or business administration.
You must have a minimum of 5 years of data engineering experience, along with excellent verbal and written communication skills.
You should also possess strong analytical and interpersonal skills and be a proven team player.
Skill Sets (Required):
4+ years of hands-on data engineering experience working with big data technologies such as Hadoop, Spark, Kafka, Python, Scala, Azure Data Factory, Azure Data Lake Store, and Azure Databricks.
Hands-on experience in data modelling and in data pipeline design and development.
Strong technical knowledge of performance tuning and query optimization on large data sets.
Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modelling and design.
Experience with data warehouse concepts.
Knowledge of data virtualization and semantic-layer tools such as Dremio.
Knowledge of the airline domain.
Knowledge of agile/lean development methodologies.