Questrade opened its doors in 1999 and remains one of Canada's fastest-growing online brokerages. It offers a great balance of platforms, a wide range of markets to trade, and services that include multiple account types.
The first product was a trading platform; since then the company has added several new business lines and is on its way to a full digital banking solution.
The main office is in Toronto.
We are currently looking for a Senior Data Engineer (GCP) on a full-time, remote basis.
Must-have skills:
● 5+ years of experience in data engineering;
● Strong GCP data engineering experience with BigQuery, Dataflow, Cloud Data Fusion, Dataprep, Data Catalog, Cloud Composer, Cloud SQL, and Cloud Functions (a minimal pipeline sketch follows this list);
● Experience architecting and developing software or Internet-scale, production-grade big data solutions in virtualized environments;
● Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, reporting/analytics tools, environments, and data structures;
● Design, document, and develop complex Microsoft Power Platform (Power BI, Power Apps, etc.) solutions;
● Work with UX/UI designers to construct wireframes and mock-ups, and create Power BI Embedded reports in an agile development process;
● Work closely with our application solution architects, data scientists, and data engineering teams to develop BI solutions, enhance or define new data integration and modeling efforts, and monitor, troubleshoot, and resolve issues with deployed reporting solutions;
● Ability to work with, communicate effectively with, and influence stakeholders across external engineering teams, product development teams, business units, and external partners;
● Experience managing multiple projects simultaneously while troubleshooting technical issues and collaborating with cross-functional stakeholders;
● Creating and maintaining optimal data pipeline architectures;
● Good knowledge of SQL;
● Good knowledge of message broker systems (e.g., Kafka, Pub/Sub);
● Good knowledge of Python;
● Develop and maintain code and documentation for ETL and other data integration projects and procedures;
● Collaborate with the team to decide which tools and strategies to use in specific data integration scenarios;
● Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
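To make the requirements above concrete, here is a minimal, hypothetical sketch of the kind of pipeline this role involves: an Apache Beam job (runnable on Dataflow) that reads JSON events from Pub/Sub and appends them to a BigQuery table. Every project, topic, bucket, table, and schema name below is an illustrative placeholder, not a detail from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # All IDs here are placeholders; substitute real project/bucket/topic names.
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",           # use "DirectRunner" for local testing
        project="my-gcp-project",          # placeholder project ID
        region="northamerica-northeast2",  # placeholder region
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw messages (bytes) from a Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/trade-events")
            # Decode and parse each message as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append the parsed rows to a BigQuery table, creating it if needed.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.trade_events",   # placeholder table
                schema="symbol:STRING,quantity:INTEGER,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

Swapping "DataflowRunner" for "DirectRunner" lets the same pipeline be tested locally before it is deployed to Dataflow.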
Good-to-have skills:
● SSIS, SSRS;
● Knowledge of NoSQL databases.
Nice-to-have skills:
● Knowledge of the financial industry (banking/insurance/mortgages);
● Experience building applications using SparkML and TensorFlow;
● GCP Data Engineering certification.
Let’s discuss more details 😊