Capabilities
You bring prior experience in the following:
Databricks, Python, PySpark, Hadoop, Java, Spark, Scala, Hive, NoSQL, and data processing/ETL
Qualifications
Bachelor of Engineering, Bachelor of Computer Science, or Master’s degree
Your Team
You are sought for an integral role in a prestigious IT services, consulting, and business solutions organization with a 50-year history of global partnerships. Part of India’s largest multinational business group, operating in 46+ countries, the organization employs over 500,000 highly trained consultants. A pioneer in financial markets infrastructure and data, it seeks your expertise to strengthen its dedication to excellence in Data & Analytics, Capital Markets, and Post Trade services through open-access partnerships that shape the future of global service delivery.
Your Job
- Bring 7+ years of experience in Data processing/ETL and 3+ years in Databricks (essential) and Python.
- Take charge as a Senior Data Engineer, leading the build of an Enterprise Data platform.
- Establish scalable, robust data pipelines for healthcare data validation, ingestion, normalization/enrichment, and business-specific processing.
- Construct an Azure Data Lake using Databricks technology to centralize company-wide data for various products and services.
- Collaborate with engineering, product, program management, and operations teams to deliver pipeline platforms and build the Data Lake in alignment with business needs.
- Design, develop, and operate a scalable and resilient data platform to meet business requirements.
- Spearhead technology and business transformation by creating the Azure Data Lake.
- Uphold industry best practices in data pipelines, metadata management, data quality, governance, and privacy.
- Partner with Product Management and Business leaders for Agile delivery of existing and new offerings, optimizing the portfolio.
- Demonstrate proficiency in Cloud/Azure architectural components and building data pipelines and infrastructure.
- Possess a deep understanding of data warehousing, reporting, and analytical concepts.
- Exhibit expertise in the Big Data tech stack, including Hadoop, Java, Spark, Scala, Hive, and NoSQL data stores.
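As a rough illustration of the validation → normalization pipeline stages mentioned above, the sketch below shows the shape of such a flow in plain Python. All field names, validation rules, and units here are hypothetical examples, not taken from the posting; in Databricks this logic would typically be expressed as PySpark DataFrame transformations instead.

```python
# Hypothetical sketch of a validate -> normalize pipeline stage.
# Field names ("patient_id", "reading") and the grams-to-kilograms
# rule are illustrative assumptions, not part of the role description.

def validate(record):
    """Keep only records with a non-empty patient_id and a numeric reading."""
    return bool(record.get("patient_id")) and isinstance(
        record.get("reading"), (int, float)
    )

def normalize(record):
    """Enrich the record with a standardized unit (assumed grams -> kg)."""
    return {**record, "reading_kg": record["reading"] / 1000.0}

def run_pipeline(records):
    """Validation stage followed by normalization/enrichment stage."""
    return [normalize(r) for r in records if validate(r)]

raw = [
    {"patient_id": "p1", "reading": 72500},
    {"patient_id": "", "reading": 100},       # fails validation: empty id
    {"patient_id": "p2", "reading": "bad"},   # fails validation: non-numeric
]
print(run_pipeline(raw))
```

In a real Databricks job, the same stages would be chained as DataFrame operations (e.g. filters and column expressions) so they scale across a cluster rather than a single Python list.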