
Data Engineer III

Alter Domus

Hyderabad, India

Hybrid


ABOUT US:

As a world-leading provider of integrated solutions for the alternative investment industry, Alter Domus (meaning “The Other House” in Latin) is proud to be home to 90% of the top 30 asset managers in the private markets, and more than 6,000 professionals across 24 jurisdictions.

With a deep understanding of what it takes to succeed in alternatives, we believe in being different - in what we do, in how we work and most importantly in how we enable and develop our people. Invest yourself in the alternative, and join an organization where you progress on merit, where you can speak openly with whoever you are speaking to, and where you will be supported along whichever path you choose to take. 

Find out more about life at Alter Domus at careers.alterdomus.com  

Job Summary:

Join our dynamic and innovative team as a Data Engineer III, where you will take ownership of designing, building, and optimizing our data infrastructure and pipelines. As an experienced member of the team, you will work independently on complex data engineering challenges while beginning to provide technical guidance to junior engineers. Working closely with business stakeholders, data scientists, and cross-functional teams, you'll ensure data is reliable, efficient, secure, and accessible at scale. You will contribute to architectural decisions and help drive best practices across the organization.

Key Responsibilities:

  1. Data Pipeline Orchestration
    • Design, build, and maintain complex end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows at scale.
    • Develop and optimize advanced DAGs (Directed Acyclic Graphs) with sophisticated retry logic, error handling, alerting, and monitoring to ensure high data quality and pipeline reliability.
    • Contribute to the establishment of best practices and standards for pipeline development, testing, and deployment.
  2. Data Ingestion & Transformation
    • Design and implement scalable data integration solutions using Airbyte for ingestion and dbt for transformations in a modular, maintainable fashion.
    • Collaborate with Data Analysts, Data Scientists, and business stakeholders to implement complex transformations and business logic, ensuring data is analytics-ready and optimized for performance.
    • Handle complex data ingestion scenarios from diverse sources with varying data formats and structures.
  3. Data Modeling & Warehousing
    • Design and implement efficient and scalable data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse).
    • Leverage Databricks for advanced data processing, analytics, and real-time insights capabilities.
    • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases while maintaining performance and scalability.
    • Optimize data warehouse design for query performance and cost efficiency.
  4. Data Governance & Security
    • Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance standards.
    • Work closely with IT security to implement and maintain robust encryption standards, audit trails, and identity/role-based access controls.
    • Ensure data governance policies are implemented consistently across all data assets.
  5. Performance Optimization
    • Monitor, analyze, and tune Airflow DAGs, Snowflake queries, AWS Athena configurations, and Databricks jobs to optimize throughput, reliability, and cost-effectiveness.
    • Implement advanced optimization techniques including data partitioning, clustering, and caching strategies.
    • Proactively identify and resolve performance bottlenecks through monitoring and alerting solutions.
  6. Technical Guidance & Collaboration
    • Provide technical guidance and mentorship to junior data engineers, sharing knowledge and best practices.
    • Lead cross-functional partnerships with DevOps, Platform Engineering, Data Science, and business teams to ensure seamless integration of data workflows and systems.
    • Communicate technical solutions effectively to both technical and non-technical stakeholders, translating business requirements into scalable technical solutions.
    • Participate in architectural decisions and provide input on data engineering strategy.
  7. Continuous Improvement
    • Actively participate in and occasionally lead architecture reviews, design reviews, and code reviews to ensure quality, scalability, and alignment with best practices.
    • Stay current with emerging trends in data engineering, orchestration tools (Airflow, MWAA), and cloud services (AWS, Snowflake, Databricks), and recommend new technologies or approaches.
    • Contribute to DevOps and DataOps best practices, including CI/CD for data pipelines, infrastructure as code, and automated testing.

Qualifications & Skills:

  • Education & Experience
      • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
      • 4+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms.
      • Demonstrated track record of delivering complex data engineering projects independently.
      • Experience providing technical guidance to other engineers is a plus.
  • Technical Skills
      • Cloud & Orchestration:
        • Advanced proficiency with Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling.
        • Strong understanding of Airflow best practices including DAG structure, dependency management, error handling, scaling, and optimization.
      • AWS Expertise:
        • Strong hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM, CloudWatch).
        • Working knowledge of AWS infrastructure as code (Terraform, CloudFormation).
      • Snowflake:
        • Advanced proficiency in designing and implementing data warehouses, configuring security, optimizing complex queries, and managing Snowflake performance.
        • Experience with Snowflake cost optimization techniques.
      • Databricks:
        • Strong experience with Databricks for large-scale data processing, analytics, and real-time data insights.
        • Familiarity with Spark fundamentals and basic performance tuning.
      • Data Ingestion & Transformation:
        • Strong experience with Airbyte or similar tools for complex data ingestion scenarios.
        • Advanced proficiency with dbt or other SQL-based transformation frameworks for modular, maintainable data processing.
        • Exposure to real-time streaming technologies (e.g., Kafka, Kinesis) is beneficial.
      • Programming:
        • Advanced proficiency in Python and/or Java/Scala for building robust data pipelines and custom integrations.
        • Strong software engineering fundamentals including testing, documentation, and code quality.
      • Query Languages:
        • Advanced to expert-level knowledge of SQL for complex data manipulation, analysis, and performance optimization.
      • DevOps & Infrastructure:
        • Working experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, CloudFormation).
        • Good understanding of DevOps best practices for managing Airflow environments (e.g., version control for DAGs, automated testing, monitoring).
  • Soft Skills
      • Strong problem-solving and analytical abilities with experience tackling complex technical challenges.
      • Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams and communicate with stakeholders at various levels.
      • Emerging leadership capabilities with the ability to mentor and guide junior engineers.
      • Ability to work independently and take ownership of projects in a fast-paced, agile environment while managing multiple priorities.
      • Good business acumen and ability to align technical solutions with business objectives.

Preferred Certifications & Experience:

  • AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect) are highly desirable.
  • Snowflake certifications (e.g., SnowPro Core) are a plus.
  • Databricks certifications are beneficial.
  • Demonstrated experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, CloudFormation).
  • Working knowledge of DevOps best practices for managing production data environments.
  • Familiarity with financial services data, especially private equity and alternative investments, is not required but would be highly valuable in this role.

WHAT WE OFFER

We are committed to supporting your development, advancing your career, and providing benefits that matter to you. 

Our industry-leading Alter Domus Academy offers six learning zones for every stage of your career, with content tailored to your ambitions, plus resources from LinkedIn Learning.

Our global benefits also include:

  • Support for professional accreditations such as ACCA and study leave 
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday!
  • Continuous mentoring along your career progression 
  • Active sports, events and social committees across our offices 
  • 24/7 support available from our Employee Assistance Program 
  • The opportunity to invest in our growth and success through our Employee Share Plan 
  • Plus additional local benefits depending on your location 

Equity in every sense of the word:

We are in the business of equity, in every sense of the word. For us, this means taking action to ensure every colleague has equal opportunity, valuing every voice and experience across our organisation, maintaining an inclusive culture where you can bring your whole self to work, and making Alter Domus a workplace where everyone feels they belong. 

We celebrate our differences, and understand that our success relies on diverse perspectives and experiences, working towards shared goals and a common purpose. We take pride in creating a workplace where all our people are empowered to be truly invested in the alternative and bring their whole selves to work.

We are committed to ensuring a welcoming recruiting and onboarding process for everyone. Please contact our hiring team if you require any accommodations to make our recruitment process more accessible for you. 

(Alter Domus Privacy notice can be reviewed via Alter Domus webpage: https://alterdomus.com/privacy-notice/)
