Senior Databricks Data Engineer
Luxoft
- Abu Dhabi
- Permanent
- Full-time
- Key Responsibilities:
- Develop and maintain scalable data pipelines and transformation workflows using Databricks, PySpark, and SQL.
- Support the migration of datasets, pipelines, and transformation logic from Palantir Foundry to Databricks Delta Lake.
- Work with investment and financial datasets, including market data, portfolio positions, transactions, pricing data, and risk metrics.
- Build and maintain ETL/ELT pipelines that enable data availability for investment analytics, reporting, and portfolio management systems.
- Implement data validation, reconciliation, and quality frameworks to ensure financial data accuracy and consistency.
- Optimize Spark jobs, cluster configurations, and storage formats to improve performance and cost efficiency.
- Maintain data lineage, documentation, and governance practices to meet financial industry standards.
- Support monitoring, troubleshooting, and performance tuning of production data pipelines.
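By way of illustration, the validation and reconciliation work described above might look like the following minimal sketch in plain Python (the record layout, field names, and tolerance are hypothetical; in practice this logic would run as a PySpark job against Delta tables):

```python
# Minimal reconciliation sketch: compare portfolio positions from a source
# system against those loaded into the target platform (hypothetical layout).
from decimal import Decimal

def reconcile_positions(source, target, tolerance=Decimal("0.01")):
    """Return per-(portfolio, instrument) mismatches beyond a tolerance."""
    src = {(r["portfolio"], r["instrument"]): Decimal(r["quantity"]) for r in source}
    tgt = {(r["portfolio"], r["instrument"]): Decimal(r["quantity"]) for r in target}
    mismatches = {}
    for key in src.keys() | tgt.keys():  # union covers rows missing on either side
        s, t = src.get(key, Decimal(0)), tgt.get(key, Decimal(0))
        if abs(s - t) > tolerance:
            mismatches[key] = {"source": s, "target": t, "diff": s - t}
    return mismatches

source = [
    {"portfolio": "P1", "instrument": "AAPL", "quantity": "100"},
    {"portfolio": "P1", "instrument": "MSFT", "quantity": "50"},
]
target = [
    {"portfolio": "P1", "instrument": "AAPL", "quantity": "100"},
    {"portfolio": "P1", "instrument": "MSFT", "quantity": "49"},
]
print(reconcile_positions(source, target))
```

Decimal arithmetic is used deliberately: binary floats are unsuitable for financial quantities where exact reconciliation matters.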
- Mandatory Skills:
- Advanced Databricks Experience. Deep knowledge of Databricks architecture, Delta Lake, job orchestration, cluster management, and performance tuning.
- Investment Data Experience. Hands-on work with investment data, such as market data, portfolio holdings, transactions, pricing data, risk metrics, and financial instruments.
- Expert-Level PySpark & Python Skills. Strong ability to design, optimize, and refactor distributed data processing workflows.
- Advanced SQL & Data Modeling Expertise. Experience in dimensional modeling, lakehouse architecture patterns, and query optimization.
- Cloud Platform Experience (Azure preferred). Hands-on experience deploying and managing data platforms in cloud environments, including storage, security, and networking considerations.
- Strong Hands-on Expertise in Palantir Foundry. Proven experience with Foundry pipelines, ontologies, data lineage, transformations, and platform governance.
- Proven Migration Experience from Palantir Foundry to Databricks. Demonstrated experience leading or executing platform migrations, including pipeline conversion, data model redesign, and production cutover.
- Familiarity with Dynatrace or Datadog for system observability and monitoring.
- Databricks certification, cloud certifications (Azure/AWS), or enterprise data architecture certifications.