Senior Enterprise Data Architect

GTA

Role Purpose:

The purpose of the Data Architect role is to:

  • Architect and implement advanced data solutions using Snowflake on AWS, ensuring scalable, secure, and high-performance data environments.
  • Migrate the existing data warehouse solution to Snowflake.
  • Evaluate technology platforms in the data and analytics space.
  • Collaborate with cross-functional teams (data engineers, AI engineers, business, solution architects) to translate business requirements into technical solutions aligned with the organization’s data strategy.
  • Ensure data governance, security, and compliance within the Snowflake ecosystem, adhering to regulatory and organizational standards.

Experience and Capabilities:

  • Extensive experience (8+ years) in data architecture and engineering, with a proven track record in large-scale data transformation programs, ideally in insurance or financial services.
  • Proven experience in architecting and implementing advanced data solutions using Snowflake on AWS.
  • Expertise in designing and orchestrating data acquisition pipelines using AWS Glue for ETL/ELT, with Snowflake OpenFlow and Apache Airflow for workflow automation, enabling seamless ingestion of data from diverse sources (see the orchestration sketch after this list).
  • Proven experience with dbt to manage and automate complex data transformations within Snowflake, ensuring modular, testable, and version-controlled transformation logic.
  • Experience implementing lakehouse solutions on the Medallion architecture for financial services or insurance carriers.
  • Experience optimizing and tuning Snowflake environments for performance, cost, and scalability, including query optimization and resource management (see the tuning sketch after this list).
  • Experience in architecting/leading migration of workloads from Cloudera to Snowflake.
  • Ability to design Streamlit apps and define new capabilities and data products leveraging Snowflake ML and LLMOps capabilities (see the Streamlit sketch after this list).
  • Experience evaluating data technology platforms, including data governance suites and data security products.
  • Exposure to enterprise data warehouse solutions such as Cloudera and AWS Redshift, and to Informatica toolsets (IDMC, PowerCenter, BDM).
  • Ability to develop robust data models and data pipelines to support data transformation, integrating multiple data sources and ensuring data quality and integrity.
  • Ability to document architecture, data flows, and transformation logic to ensure transparency, maintainability, and knowledge sharing across teams.
  • Strong knowledge of data lifecycle management, data retention, data modeling, and working knowledge of cloud computing and modern development practices.
  • Experience with data governance, metadata management, and data quality frameworks (e.g., Collibra, Informatica).
  • Experience with policy and data conversion from legacy to modern platforms.
  • Deep expertise in Snowflake (SnowPro Advanced certification preferred), with hands-on experience delivering Snowflake as an enterprise capability.
  • Hands-on experience with AWS Glue for ETL/ELT, Apache Airflow for orchestration, and dbt for transformation (preferably deployed on AWS ECS).
  • Proficiency in SQL, data modeling, ETL/ELT processes, and scripting languages (Python/Java).
  • Familiarity with data mesh principles, data product delivery, and modern data warehousing paradigms.
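
To illustrate the kind of pipeline orchestration referenced above, here is a minimal sketch of an Airflow DAG that chains an AWS Glue extract, a Snowflake bulk load, and a dbt run. It assumes Airflow 2.x with the amazon and snowflake provider packages installed; the Glue job name, connection IDs, stage, table, and dbt project path are all hypothetical placeholders, not part of the role description.

```python
# A minimal orchestration sketch, assuming Airflow 2.x with the
# apache-airflow-providers-amazon and apache-airflow-providers-snowflake
# packages installed. All job, connection, stage, and table names below
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="policy_ingest_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # 1. Land raw source files in S3 (the bronze layer) via AWS Glue.
    extract = GlueJobOperator(
        task_id="run_glue_extract",
        job_name="raw_policy_ingest",      # hypothetical Glue job
        aws_conn_id="aws_default",
    )

    # 2. Bulk-load the landed files into a Snowflake staging table.
    load = SnowflakeOperator(
        task_id="copy_into_staging",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO staging.policy_raw
            FROM @raw_stage/policies/
            FILE_FORMAT = (TYPE = PARQUET)
        """,
    )

    # 3. Run the dbt project to build the silver/gold (Medallion) models.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/policy_project && dbt run && dbt test",
    )

    extract >> load >> transform
```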
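For the performance, cost, and scalability item, the following is a rough tuning sketch using snowflake-connector-python. The account, warehouse, and table names are hypothetical; the statements themselves (CLUSTER BY, SYSTEM$CLUSTERING_INFORMATION, AUTO_SUSPEND, and ACCOUNT_USAGE.QUERY_HISTORY) are standard Snowflake features, though multi-cluster warehouses require Enterprise edition or above.

```python
# A minimal tuning sketch, assuming snowflake-connector-python and a role
# with ALTER privileges. Account, warehouse, and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
cur = conn.cursor()

# Cluster a large fact table on its most common filter columns so that
# micro-partition pruning can skip irrelevant data at query time.
cur.execute("ALTER TABLE gold.fact_claims CLUSTER BY (claim_date, region)")

# Inspect how well the table is clustered on those columns.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('gold.fact_claims', '(claim_date, region)')"
)
print(cur.fetchone()[0])

# Control cost: auto-suspend after 60 idle seconds and cap scale-out
# (MAX_CLUSTER_COUNT needs a multi-cluster warehouse, Enterprise edition+).
cur.execute("ALTER WAREHOUSE TRANSFORM_WH SET AUTO_SUSPEND = 60 MAX_CLUSTER_COUNT = 3")

# Surface the most expensive recent queries as optimization candidates.
cur.execute("""
    SELECT query_text, total_elapsed_time, partitions_scanned, partitions_total
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```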
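Finally, for the Streamlit item, a minimal sketch of a Streamlit-in-Snowflake data product, assuming the app runs inside Snowflake where get_active_session() supplies a Snowpark session. The gold-layer table and its columns are hypothetical placeholders.

```python
# A minimal Streamlit-in-Snowflake sketch. Assumes the app runs inside
# Snowflake (so get_active_session() works) and that a hypothetical
# gold.fact_claims table exists with region/claim_date/claim_amount columns.
import streamlit as st
from snowflake.snowpark.context import get_active_session
from snowflake.snowpark.functions import col, sum as sum_

session = get_active_session()

st.title("Claims Data Product Explorer")

# Let the user pick a region from the curated (gold) layer.
regions = (
    session.table("gold.fact_claims")
    .select("region").distinct().sort("region")
    .to_pandas()
)
region = st.selectbox("Region", regions["REGION"])

# Aggregate claims for the chosen region with the Snowpark DataFrame API,
# which pushes the computation down to Snowflake.
df = (
    session.table("gold.fact_claims")
    .filter(col("region") == region)
    .group_by("claim_date")
    .agg(sum_("claim_amount").alias("total_claims"))
    .sort("claim_date")
    .to_pandas()
)

# Snowflake returns unquoted identifiers in uppercase.
st.bar_chart(df, x="CLAIM_DATE", y="TOTAL_CLAIMS")
st.dataframe(df)
```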