
Data Engineer

Technology

Vancouver, BC • ID: 858-028 • Full-Time/Regular

The Organization

LGM is a national leader in providing warranty, finance and insurance services to the Canadian automotive industry. Since 1998, LGM has partnered with leading automotive manufacturers and dealerships across Canada to deliver award-winning F&I solutions. Dealer partnerships are complemented by the strong backing and support of their automotive manufacturer brands, which include BMW/MINI, Kia, Mazda, Volvo, Jaguar/Land Rover, Mitsubishi Motors, Polestar and Motorrad.

 

The Job

The Data Engineer will play a key role in modernizing and scaling LGM’s data platform using Microsoft Fabric and Azure within our Business Intelligence team.

 

Key Responsibilities

  • Develop, optimize, and operate resilient data pipelines for ingestion, replication/CDC, and transformation across cloud and on-prem sources, leveraging Microsoft Fabric and/or Azure data services to meet defined SLAs (batch and near real-time). 

  • Implement modern lakehouse/warehouse patterns using Medallion architecture including data quality checks, lineage/metadata, and standardized reusable data products. 

  • Design, test, and implement dimensional data models (star schemas, conformed dimensions, fact tables, aggregates) to support enterprise reporting, self-service analytics, and governed metric definitions. 

  • Perform SQL engineering and performance optimization across relational platforms (e.g., SQL Server, Azure SQL, Fabric Warehouse SQL), including query tuning, troubleshooting production issues, and improving data structures for reliability and scale. 

  • Enable analytics and reporting by publishing curated datasets and semantic models, supporting Power BI development best practices (performance, incremental refresh patterns, RLS/OLS, reusable measures), and contributing to migration away from legacy reporting (e.g., SSRS) where applicable. 

  • Ingest and curate semi-structured and unstructured data (JSON, APIs, logs, files), managing schema evolution, validation, and scalable storage formats. 

  • Collaborate on enterprise data governance: data definitions, data contracts, documentation, cataloging, and stewardship practices to ensure consistency and trusted data across domains. 

  • Be highly responsive to critical production issues, providing timely and effective solutions. 

  • Participate in code reviews both as a reviewer and as a reviewee, in a respectful way that facilitates skill building for all team members. 

  • Engage in all aspects of the Agile process, proactively contributing to improvements in the processes to minimize rework/waste and increase quality and velocity. 

  • Keep abreast of software industry best practices, processes, and technologies.  

 

Core Competencies

  • Communication – Able to present information clearly and articulately, both verbally and in writing. 

  • Collaboration – Develops positive relationships with others to build consensus, morale and commitment to goals and objectives. 

  • Innovation – Displays the ability to think outside of the box to develop creative and new solutions that meet current and future needs. 

  • Flexibility – Easily adapts to changing environment and resources. 

  • Productivity – Strives to consistently achieve excellence in all tasks and goals. 

  • Accountability – Takes personal ownership and responsibility for the quality and timeliness of work commitments and decisions. 

 

Required Skills

  • Strong experience in data engineering, ETL/ELT, and data warehousing, including dimensional modeling and delivering curated data marts/data products. 

  • Advanced T-SQL (Transact-SQL) skills for development, troubleshooting, and maintenance of legacy ETL processes and data pipelines (e.g., SSIS/SQL Server–based workloads). 

  • Experience with Microsoft cloud data platforms, with preference for Microsoft Fabric and/or Azure services such as OneLake, Lakehouse/Warehouse, Data Pipelines, Dataflows Gen2, Notebooks/Spark, and Mirroring. 

  • Experience enabling Power BI at scale, including semantic model fundamentals (measures, relationships, performance patterns) and governance practices (certification, shared datasets, workspace standards). SSRS experience. 

  • Ability to think creatively. 

 

Education

  • Post-secondary education in Computer Science or related discipline 

 

Experience

  • 3+ years development and maintenance of Data Warehouses and ETL processes 

 

Why LGM

  • Compensation Range: $100,000-$118,000 per year base, plus Corporate Variable Pay (LGM's variable bonus pay program)

  • Hybrid work model (2 days in the office)

  • Comprehensive compensation package including:

    • Extended health benefits plan

    • Group RRSP

    • Performance bonus

    • Health & wellness benefits

    • Education sponsorship

  • Work-life balance perks:

    • Four paid days annually to “give back” to the community

    • Your birthday off — every year

  • Vehicle rebate program: Up to $400 per month

 
 
