Senior Data Engineer - Data Platform Engineering
Location: Greater Toronto Area (GTA)
Job Summary
We are seeking a collaborative and resourceful Senior Data Engineer to join our IT Data Platform Engineering Delivery team. You are dedicated, naturally inquisitive, and comfortable in a fast-paced environment.
This role is part of our Information Technology Enterprise Data Services Group. You will lead architecture, design, analysis, and implementation within a successful and experienced team. You'll apply your depth of knowledge and expertise with both modern and legacy data platforms to develop data ecosystems that meet business requirements and align with enterprise architecture goals and standards.
What you’ll do
- Design, build, and operationalize large-scale enterprise data solutions in Hadoop, Postgres, Oracle, and Snowflake.
- Design and develop ETL pipelines to ingest data into Oracle/Hadoop/Postgres from different data sources (Files, Mainframe, Relational Sources, NoSQL, Hadoop, etc.) using Informatica PowerCenter and BDM.
- Craft solution designs for data acquisition/ingestion of multifaceted data sets (internal/external), data integrations, and data warehouses/marts.
- Collaborate with business partners, product owners, functional specialists, business analysts, IT architects, and developers to develop solution designs that adhere to architecture standards.
- Ensure that solutions adhere to enterprise data governance and design standards.
- Act as a point of contact for delivery teams, resolving architectural, technical, and solution-related challenges efficiently.
- Advocate for data catalogs, data governance, and data quality practices.
- Demonstrate outstanding problem-solving skills.
- Work in an Agile delivery framework to evolve data models and solution designs to deliver value incrementally.
- Be a self-starter with experience working in a fast-paced agile development environment.
- Provide strong mentoring and coaching, leading by example for junior team members.
- Be outcome-focused, with strong decision-making and critical-thinking skills; challenge the status quo to improve delivery pace, performance, and efficiency.
- Minimum 3 days a week in office.
What you’ll bring
- University degree in Computer Engineering or Computer Science.
- 8+ years of experience crafting solutions for data lakes, data integrations, and data warehouses/marts.
- Solid grasp and experience with data technologies & tools (Snowflake, Hadoop, Oracle, PostgreSQL, Informatica BDM & PowerCenter, etc.).
- Excellent coding skills in SQL and Java.
- Knowledge of or experience with Guidewire ClaimCenter (GW CC) is a nice-to-have.
- Outstanding knowledge of and experience with ETL using the Informatica product suite.
- Experience implementing data governance principles and driving efficiencies.
- Familiarity with the Agile software development methodology.
- Excellent verbal and written communication skills.
- Insurance knowledge is an asset, along with the ability to understand the complex business processes that drive technical systems.