IT - Application & Software Development
Oct 06, 2022
Before going through the qualifications, please understand that we are flexible! If you feel you have other expertise or experience that would make you a good fit, please apply and tell us why in a cover letter.
- Knowledge of data management standard methodologies and frameworks (e.g., DAMA), with experience in data analysis processes and tools.
- Knowledge of and expertise in conceptual modelling, metadata, reference data management, master data management (MDM), and data quality and analysis.
- Knowledge of software engineering practices for SDLC, including coding standards, code reviews, source control management, build processes, testing, and operations.
- Eager to learn and build both technical and analytical skills.
- Self-starter with the ability to work independently and balance multiple projects simultaneously.
- Ability to prioritize and manage work independently in a fast-paced environment.
- Strong C# programming skills in a large system/application environment, including practical knowledge of availability, scalability, multi-threaded development, and performance design patterns.
- Solid understanding of RESTful, SOAP, GraphQL, and WebSocket services.
- Extensive experience with various database technologies, such as RDBMS (MS SQL), NoSQL (Elasticsearch), and graph (Neo4j).
- Experience with messaging services such as RabbitMQ.
- Experience with data cleansing tools such as OpenRefine.
- Knowledge of Web Services, XML technologies, SOA, and Enterprise Integration Patterns.
- Experience building both real-time and batch data ingestion pipelines using standard methodologies.
- Experience using source control tools such as Git and GitHub.
- Familiarity with CI/CD orchestration tools such as Azure DevOps, GitHub Actions, Argo CD, Maven, or similar.
- Experience with Docker and Kubernetes is an asset.
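The data-cleansing qualification above mentions OpenRefine, whose clustering is built on key-collision methods. As a small illustration (a Python sketch of OpenRefine's "fingerprint" keying idea; the sample values and function names are invented, and the role itself is C#-centric):

```python
import re
import unicodedata
from collections import defaultdict

def fingerprint(value: str) -> str:
    """OpenRefine-style fingerprint key: trim, lowercase, strip
    accents and punctuation, then sort and de-duplicate tokens."""
    value = value.strip().lower()
    # Remove accents via NFKD decomposition, dropping combining marks.
    value = unicodedata.normalize("NFKD", value)
    value = "".join(c for c in value if not unicodedata.combining(c))
    # Drop punctuation and split on whitespace.
    tokens = re.sub(r"[^\w\s]", "", value).split()
    return " ".join(sorted(set(tokens)))

def cluster(values):
    """Group raw values whose fingerprint keys collide."""
    groups = defaultdict(list)
    for v in values:
        groups[fingerprint(v)].append(v)
    return {k: vs for k, vs in groups.items() if len(vs) > 1}

names = ["Acme Corp.", "acme corp", "Corp, ACME", "Globex Inc."]
print(cluster(names))
# {'acme corp': ['Acme Corp.', 'acme corp', 'Corp, ACME']}
```

Values that collide on the same key are candidates for merging into a single canonical form, which is the essence of standardizing messy reference data.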
Our client is a professional sports and commercial real estate company based in Toronto, Ontario.
- Maintain our client’s Data Fabric using CluedIn.
- Ingest data into the platform using CluedIn’s .NET Core Crawler framework.
- Create and maintain entities, edges, and vocabulary using a graph modelling framework.
- Normalize, standardize, harmonize, and clean the data for further aggregation and analysis using CluedIn Clean.
- Implement data enrichers to further improve the data quality.
- Aggregate data using GraphQL and stream it out for various uses, such as reporting and analysis, and insert golden records into the CRM ecosystem.
- Assist with developing roadmaps for comprehensive data solutions to address reporting, analysis, and decision-making needs. This includes standard methodologies for data mapping and data integration processes to improve existing systems.
- Partner with other teams, such as infrastructure, CRM, and business intelligence, to define technical solutions and efficient integrations within the existing landscape.
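The golden-record step in the responsibilities above is a classic MDM survivorship problem: duplicate records from several source systems are merged so that, for each field, the best surviving value wins. A minimal Python sketch of a "most recent non-null wins" rule (this is illustrative only, not CluedIn's actual merge logic; the field names, timestamps, and sample records are invented):

```python
from datetime import date

def golden_record(records):
    """Merge duplicate source records into one golden record.
    Survivorship rule: for each field, keep the non-null value from
    the most recently updated source. The 'updated' timestamp field
    is an assumed convention for these sample records."""
    merged = {}
    # Process oldest first so newer non-null values overwrite older ones.
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value not in (None, ""):
                merged[field] = value
    return merged

crm = {"name": "Jane Doe", "email": "jane@old.example", "phone": None,
       "updated": date(2021, 3, 1)}
ticketing = {"name": "Jane Doe", "email": "jane@new.example",
             "phone": "416-555-0100", "updated": date(2022, 9, 15)}
print(golden_record([crm, ticketing]))
# {'name': 'Jane Doe', 'email': 'jane@new.example', 'phone': '416-555-0100'}
```

Real MDM platforms apply richer, per-field survivorship policies (source trust scores, format validation, manual stewardship), but the merge shape is the same: many conflicting source records in, one consolidated record out.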