Senior Data Engineer

Company : Wellington-Altus Private Wealth
Salary : Details not provided
Location : Manitoba

Full Description

Senior Data Engineer


Location: This position will be located in our Toronto or Winnipeg offices.


Application Deadline: September 30, 2022


Our organization:


Founded in 2017, Wellington-Altus Financial Inc. (Wellington-Altus) is the parent company to Wellington-Altus Private Counsel and Wellington-Altus Private Wealth, the top-rated* wealth advisory company in Canada and one of Canada's Best Managed Companies. With more than $20 billion in assets under administration and offices across the country, Wellington-Altus identifies with successful, entrepreneurial advisors and portfolio managers, and their high-net-worth clients.

* Investment Executive 2022 Brokerage Report Card

The opportunity:


Reporting to the Senior Manager, Data Engineering, the Senior Data Engineer will play a key role on the data team, which will look to the incumbent for advice on modern development concepts, data concerns, and innovative ways to address them.


The Senior Data Engineer will have a company-wide view of the Data Engineering solutions they build in this role and will consistently think in terms of automating or expanding the results company-wide. The incumbent will have the opportunity to help design and build our data application and infrastructure platforms, working with emerging technologies and associated AWS cloud services, and to influence business intelligence solutions end-to-end: business requirements, workflow instrumentation, data modeling, and ETL.


Additionally, the Senior Data Engineer should be experienced in designing, implementing, and operating stable, scalable, low-cost solutions that flow data from production systems into the data lake and into end-user-facing applications. The incumbent should understand enterprise information systems, possess a strong business sense, and participate on a team that puts these skills into action.


Key responsibilities include:

  • Creating and maintaining optimal data pipeline architecture using Python, AWS Glue, and Lambda (see the sketch after this list).
  • Assembling large, complex data sets that meet functional / non-functional business requirements.
  • Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL (on MS SQL and/or Redshift) and AWS “Big Data” technologies such as AWS Glue and Lambda.
  • Building analytical tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Working with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keeping our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Working with data and analytics experts to strive for greater functionality in our data systems.
  • Designing, building, and reusing DevSecOps pipelines, working with the DevOps Team to ensure repeatable tasks are automated.
  • Delivery Management
    • Combining your technical expertise and problem-solving passion to work closely with project teams, turning complex ideas into end-to-end solutions that transform our clients’ business.
    • Contributing to the design, development and delivery of large-scale data systems, data processing and data transformation projects that deliver business value for clients.
    • Conducting technical feasibility assessments and providing project estimates for the design and development of the solution.
    • Providing technical inputs to agile processes, such as epic, story, and task definitions, to resolve issues and remove barriers throughout the lifecycle of client engagements.
  • Environment Management
    • Creating and maintaining infrastructure-as-code for cloud, on-premises, and hybrid environments using AWS cloud services and related tooling
    • Creating and maintaining automated code deployments
  • Leadership
    • Supporting team members in challenging situations (resource / release challenges, cross-team communication)
    • Scheduling ad-hoc meetings to resolve disagreements or ambiguities that could jeopardize milestones and/or artifacts.
    • Easing conversations and promoting constructive attitudes.
    • Facilitating and ensuring healthy communication between project team members.
    • Conducting regular retrospectives and addressing the team's feedback.
    • Understanding team strengths and weaknesses to improve each phase of the project lifecycle.
  • Performing other duties as assigned
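
As a concrete illustration of the first responsibility above, here is a minimal sketch of an event-driven pipeline trigger: a Python Lambda handler that starts an AWS Glue job run whenever a file lands in S3. The job name, bucket layout, and job argument are hypothetical placeholders, not actual Wellington-Altus systems.

```python
# Hypothetical sketch: an S3-triggered Lambda that starts a Glue ETL job run.
# GLUE_JOB_NAME and the --source_path argument are illustrative placeholders.
import urllib.parse

import boto3

glue = boto3.client("glue")

GLUE_JOB_NAME = "raw-to-datalake-ingest"  # hypothetical Glue job name


def handler(event, context):
    """Start one Glue job run per object landing in the raw S3 prefix."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        run = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            # The Glue script would read this via getResolvedOptions().
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue run {run['JobRunId']} for s3://{bucket}/{key}")
```

Keeping the trigger in a small Lambda rather than polling from the Glue job itself keeps the pipeline event-driven and low-cost, in line with the responsibilities above.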

The ideal candidate will possess:


  • A bachelor's degree in computer science, information management or equivalent experience in a related field
  • AWS and/or Azure certification or equivalent experience in a related field
  • 3-5 years' experience designing, developing, architecting, and delivering data engineering solutions on-premises and in the cloud using technologies such as Python, AWS Glue, and Lambda
  • Experience working within a financial services institution
  • Experience building data products incrementally, and integrating and managing datasets from multiple sources
  • Experience with AWS tools and technologies (Redshift, S3, EC2, etc.)
  • Intermediate data modelling skills and advanced SQL knowledge, with expertise in SQL Server, Oracle, MySQL, DB2, and columnar databases
  • Demonstrated industry experience in the fields of Data Warehousing, Data Science and Big Data processing.
  • Knowledge of software development practices, concepts, and methodologies (e.g., Waterfall, Agile, Iterative) and related technologies, obtained through formal training and/or work experience
  • Knowledge of one or more requirements analysis and problem decomposition techniques
  • Understanding of industry standards and standard business capabilities
  • Expert verbal and written communication skills, as well as expert analytical, problem-solving, and influence skills
  • An expert ability to collaborate and act as part of a team, with a focus on cross-group collaboration
  • Development/implementation experience in AWS using Python, Glue, S3, Redshift, API Gateway, Lambda, and other managed services such as AWS Transfer for SFTP.
  • Development/implementation experience with database, API and file-based system integration, including various semi-structured and unstructured data formats (CSV, JSON, XML, EBCDIC, etc.)
  • Experience with Data Warehousing/data modelling concepts (star/snowflake schemas, SCD Types 1 and 2; see the sketch after this list).
  • Experience with performance tuning and troubleshooting ETL pipelines.
  • Experience with DevOps, DataOps and DevSecOps best practices using GitLab.
  • Experience with ServiceNow, Jira and Confluence.
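
To make the SCD Type 2 item above concrete, here is a minimal pandas sketch of the merge logic, assuming illustrative key and column names (is_current, valid_from, valid_to): rows whose tracked attributes change are expired and re-inserted as new current versions, preserving history.

```python
# Hypothetical sketch of a Slowly Changing Dimension Type 2 merge in pandas.
# The key and column names (is_current, valid_from, valid_to) are assumptions.
from datetime import date

import pandas as pd


def scd2_merge(dim, incoming, key, tracked):
    """Expire changed rows and append new current versions (SCD Type 2).

    Mutates and returns `dim`; `incoming` is the latest source snapshot.
    """
    today = date.today()
    current = dim[dim["is_current"]]

    # Keys whose tracked attributes differ from the incoming snapshot.
    merged = current.merge(incoming, on=key, suffixes=("", "_new"))
    diff = (merged[tracked].values
            != merged[[f"{c}_new" for c in tracked]].values).any(axis=1)
    changed_keys = merged.loc[diff, key]

    # Expire the superseded current versions.
    expire = dim[key].isin(changed_keys) & dim["is_current"]
    dim.loc[expire, "is_current"] = False
    dim.loc[expire, "valid_to"] = today

    # Append new versions for changed keys, plus rows for brand-new keys.
    is_new = ~incoming[key].isin(dim[key])
    fresh = incoming[incoming[key].isin(changed_keys) | is_new].copy()
    fresh["is_current"] = True
    fresh["valid_from"] = today
    fresh["valid_to"] = pd.NaT
    return pd.concat([dim, fresh], ignore_index=True)
```

Called as, say, `dim = scd2_merge(dim, snapshot, key="client_id", tracked=["address", "tier"])`, an address change closes the old row (valid_to set to today) and opens a new current row, so point-in-time reporting still sees the historical value.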


Conditions of employment:


  • Must be legally eligible to work in Canada
  • A background check, satisfactory to the employer, may be required of the successful applicant prior to commencing employment

Wellington-Altus is strongly committed to equity and diversity within its community and welcomes applications from women, racialized persons, Indigenous peoples, persons with disabilities, and persons of all sexual orientations and genders. All qualified individuals who would contribute to the further diversification of our organization are encouraged to apply.


If you require accommodation for the recruitment process, please let us know at the point of application.


To apply:


Click the Apply For This Job button to submit your resume, cover letter and salary expectations. You will be contacted if you are selected for an interview. More information about working at Wellington-Altus can be found on our website at www.wellington-altus.com.