Bring your models alongside your data using FABRIQ’s native platform integrations.
- Import models as code, libraries, or trained model artefacts from ALGOREUS
- Access versioning, branching, reproducibility, and lineage capabilities of FABRIQ
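For illustration, here is a minimal sketch of how a trained model might be packaged as a versioned artefact for import, together with the metadata that versioning, branching, and lineage rely on. The field names, dataset reference, and packaging format are assumptions for this sketch, not FABRIQ or ALGOREUS APIs.

```python
# A minimal, hypothetical sketch: package a trained model as a versioned artefact
# with the metadata needed to tie it back to its code and training data.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Train a candidate model (any framework would do; this one is illustrative).
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Bundle the artefact with illustrative lineage metadata. All keys are assumptions.
artefact = {
    "model": model,
    "metadata": {
        "name": "churn-classifier",
        "version": "1.3.0",                             # semantic version of the artefact
        "training_dataset": "datasets/churn-features",  # hypothetical dataset reference
        "git_commit": "abc123",                         # ties the artefact back to model-as-code
    },
}
joblib.dump(artefact, "churn-classifier-1.3.0.joblib")
```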
Tie your models back to the processes that drive your organisation with the FABRIQ Ontology.
- Define a robust foundation for AI-powered end-user workflows, with granular security and governance
- Release and inject your models directly into core applications, without adapters or glue code
- Build feature-rich compound applications in hours instead of months
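As a purely illustrative sketch, the snippet below shows the shape of a model-backed function bound to an Ontology-style object type so an application can call it directly, without adapter code. The object type, property names, and function name are hypothetical, and a toy model stands in for one released from ALGOREUS.

```python
# Illustrative only: a model-backed function over an Ontology-style object type.
from dataclasses import dataclass

import numpy as np
from sklearn.linear_model import LogisticRegression

@dataclass
class Customer:
    # Stand-in for an Ontology object type and its direct properties (hypothetical).
    customer_id: str
    tenure_months: int
    monthly_spend: float
    support_tickets: int

# Toy model standing in for a released ALGOREUS model, trained here so the sketch runs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + X[:, 2] > 0).astype(int)
model = LogisticRegression().fit(X, y)

def predict_churn_risk(customer: Customer) -> float:
    """Function an operational application could call directly against the bound model."""
    features = [[customer.tenure_months, customer.monthly_spend, customer.support_tickets]]
    return float(model.predict_proba(features)[0][1])

print(predict_churn_risk(Customer("C-001", tenure_months=4, monthly_spend=89.0, support_tickets=3)))
```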
Enrich deployed models with decision data from your organisation’s analysts, operators, and decision-makers.
- Facilitate collaboration between AI/ML and operations teams through shared applications
- Enable operators to monitor, retrain, and improve your models with real-time feedback
- Automatically write decision data back to both the Ontology and corresponding systems of action
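The minimal sketch below shows what a decision-writeback record might look like in practice; every field name is an assumption, and a local file stands in for the platform's feedback dataset.

```python
# Hypothetical sketch: capture an operator's decision alongside the model's output
# so it can be written back to the Ontology, the system of action, and a feedback
# dataset used for monitoring and retraining.
import json
from datetime import datetime, timezone

def record_decision(object_id: str, model_score: float, operator_action: str, reason: str) -> dict:
    """Pair the model's score with the human decision and the reason given."""
    return {
        "object_id": object_id,
        "model_score": model_score,
        "operator_action": operator_action,   # e.g. "approve", "escalate", "override"
        "reason": reason,
        "decided_at": datetime.now(timezone.utc).isoformat(),
    }

# A local file stands in here for the writeback target.
with open("decision_feedback.jsonl", "a", encoding="utf-8") as feed:
    record = record_decision("C-001", 0.83, "escalate", "High churn risk on a key account")
    feed.write(json.dumps(record) + "\n")
```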
Build Models in ALGOREUS with FABRIQ
ALGOREUS implements the complete model lifecycle, spanning problem definition, development of one or more candidate solutions, evaluation of those solutions, deployment, monitoring, and iteration. Modelling Objectives provide the backbone of the model lifecycle for any problem. Core FABRIQ functionality extends the traditional model lifecycle upstream (i.e., data enrichment and management) and downstream (i.e., operationalization and feedback). This combination of end-to-end capability and interoperability means that FABRIQ customers can continue to leverage investments that are already working for them, while using ALGOREUS to tackle machine-learning challenges and bring greater precision to their enterprise operations.
Enrich and manage data
The fuel for any modelling use case is data. This includes not only data ingested from multiple source systems, but also data derived and captured throughout the model and use case lifecycle: business logic, feature sets, labels, model predictions, instance-level outcomes, end-user actions/decisions, and more. FABRIQ provides the tools not just to integrate data from anywhere, but also to transform, enrich, permission, catalog, quality-control, govern, and maintain it. This is accomplished through a platform-wide approach that includes:
- Lineage that spans datasets, logic, models, and actions to enable build automation, security, transparency, and downstream attribution. Data and code are versioned consistently, making it possible to simulate how logic changes impact downstream features, metrics, and model-driven decisions.
- Monitoring of data health, distributions, model feedback, and pipeline health, to enable AI delivery at scale.
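As a rough illustration of the kind of pipeline step these capabilities wrap, the sketch below shows a small enrichment transform followed by a health check that runs before the output is published; the column names and expected ranges are assumptions.

```python
# Illustrative sketch: an enrichment step plus a health check that fails fast,
# so problems surface in monitoring rather than in downstream features and models.
import pandas as pd

def enrich_customers(raw: pd.DataFrame) -> pd.DataFrame:
    """Derive a model feature from raw customer records."""
    out = raw.copy()
    out["spend_per_ticket"] = out["monthly_spend"] / out["support_tickets"].clip(lower=1)
    return out

def check_health(df: pd.DataFrame) -> None:
    """Assert the enriched output stays within expected bounds (thresholds assumed)."""
    assert df["spend_per_ticket"].notna().all(), "enrichment introduced nulls"
    assert df["monthly_spend"].between(0, 1e6).all(), "monthly_spend outside expected range"

raw = pd.DataFrame({"monthly_spend": [120.0, 45.5], "support_tickets": [3, 0]})
enriched = enrich_customers(raw)
check_health(enriched)
```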
Evaluate and manage models
Model evaluation and management capabilities, rooted in ALGOREUS Model Monitor, are critical to streamlining and ensuring the successful, ongoing operationalization of modelling projects, whether via production pipelines, user-facing applications, or other systems. Model Monitor serves as a searchable catalog of model candidates, capturing models, versions, and event-specific metadata for each submission. It then enables problem solvers to:
- Explore the set of model submissions along various metadata dimensions and model metrics
- Define model standards, and set up rails and governance around required reviews from various stakeholders, as well as release and deployment processes
- Integrate ALGOREUS’ model inputs and outputs with the FABRIQ Ontology, enabling connectivity with operational applications and event scenarios
- Implement a systematic testing and evaluation (T&E) plan via software, leveraging managed metrics
- Perform continuous integration and continuous deployment (CI/CD) of models
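For example, a software-defined T&E gate can score every submission on managed metrics and approve release only when agreed thresholds are met. The sketch below is illustrative; the metric set, thresholds, and function names are assumptions rather than Model Monitor APIs.

```python
# Illustrative T&E gate: compute managed metrics for a submission and apply a
# CI/CD-style release check against agreed floors (all values assumed).
from sklearn.metrics import precision_score, roc_auc_score

REQUIRED = {"roc_auc": 0.80, "precision": 0.70}

def evaluate_submission(y_true, y_score, threshold: float = 0.5) -> dict:
    """Score a model submission on the managed metrics used for review."""
    y_pred = [int(score >= threshold) for score in y_score]
    return {
        "roc_auc": roc_auc_score(y_true, y_score),
        "precision": precision_score(y_true, y_pred),
    }

def approve_for_release(metrics: dict) -> bool:
    """Release gate: every managed metric must meet its required floor."""
    return all(metrics[name] >= floor for name, floor in REQUIRED.items())

metrics = evaluate_submission([0, 1, 1, 0, 1], [0.2, 0.9, 0.7, 0.4, 0.8])
print(metrics, "release approved:", approve_for_release(metrics))
```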
Operationalize models
The ultimate goal of a modelling workflow is often deploying models to the users and systems that drive decisions and actions. ALGOREUS provides a variety of options for deploying models directly, as well as tools for seamlessly incorporating model deployments into production use. Models can be deployed into managed batch inference pipelines, interactively queryable “Live” API endpoints, or even external systems (e.g., enterprise on-premises systems, edge hardware, or multi-cloud environments). Pipeline-based batch deployments are suitable for recurring large-scale processing and benefit from FABRIQ’s data enrichment and management capabilities described earlier, such as versioning, orchestration, health checks, and lineage. Their outputs can be consumed by downstream pipelines and applications or synced to external systems. Deployments are especially powerful when models are bound to the FABRIQ Ontology. These integrations enable user-facing applications to:
- Leverage direct and linked properties of input objects, execute Ontology-based scenario analyses using the model, and incorporate the model into broader simulations
- Perform actions that propose or commit real operational changes
- Capture actions and feedback as new data via writeback. This provides business and modelling teams with a powerful data asset for monitoring, understanding, and improving production performance, as well as for identifying and adapting to new circumstances
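As a client-side illustration of the deployment patterns above, the sketch below shows a low-latency call to a “Live” endpoint and a batch scoring step; the endpoint URL, token, and payload schema are assumptions, since a real deployment defines its own contract.

```python
# Illustrative client-side sketch of the two deployment patterns (names assumed).
import pandas as pd
import requests

def score_live(customer: dict) -> float:
    """Interactive inference against a hypothetical 'Live' API endpoint."""
    resp = requests.post(
        "https://fabriq.example.com/api/models/churn-classifier/live",  # hypothetical URL
        headers={"Authorization": "Bearer <token>"},                    # placeholder credential
        json={"instances": [customer]},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["predictions"][0]

def score_batch(features: pd.DataFrame, model) -> pd.DataFrame:
    """Recurring large-scale scoring whose output feeds downstream pipelines and syncs."""
    scored = features.copy()
    scored["churn_risk"] = model.predict_proba(features.values)[:, 1]
    return scored
```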
Collectively, these capabilities enable the rapid construction and iteration of data-powered workflows and processes that are robust enough for an enterprise's critical path, while closing the loop with model development, evaluation, and management.