Use the ONNX Helper library to easily access a trained machine learning (ML) model saved in the “.onnx” file format. ONNX, short for Open Neural Network Exchange, is a standardized, framework-interoperable format for trained ML models.
While a standard Java library exists, it requires relatively advanced technical knowledge to use. The ONNX Helper is a free, downloadable add-on library for any edition of AnyLogic (Personal Learning Edition, University, or Professional). Through a single object and a single function, it provides a simple way to query ML models.
There are many cases where it’s desirable to incorporate trained machine learning (ML) models into your simulation model. The following are some concrete examples, based on our ML testbed cases:
Replacing static or distribution-based travel times with an ML model trained on real-world data. The model uses date and time inputs to predict travel duration.
Increasing the accuracy of a simulation model of a refurbishing facility by incorporating the same ML model the real-world facility uses to classify the repairability of arriving components.
Showing visually and statistically the impact and overall performance of an ML model trained to control machine speeds (e.g., using reinforcement learning) before the model is deployed in the real world.
In these types of cases, input data is retrieved and preprocessed before being used to train an ML model with one of the many available ML libraries: PyTorch, TensorFlow, caret, DL4J, and more. After an ML model is trained with data, it can be exported to a file and later called on to provide predictions (e.g., by edge devices or in simulation models).
One such file type, with the “.onnx” extension, comes from ONNX, the Open Neural Network Exchange. Its purpose is to provide an open ecosystem that keeps ML models from being locked into one specific framework. ML models in the ONNX format can be imported and called from many different frameworks, with both cross-platform and cross-language support.
Helper Object for Seamless Integration
While there is an established ONNX Runtime Java library, its orientation toward technical Java users presents a barrier to the more business-oriented AnyLogic users. In contrast, the AnyLogic ONNX Helper library serves as an abstraction over the standard runtime library, with built-in functionality that handles the routine code needed to interact with it. This vastly simplifies how end users can query their trained ML models, making the functionality accessible to anyone interested.
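For context on what the helper abstracts away, here is a minimal sketch of querying a model directly with the ONNX Runtime Java API. The file name `model.onnx`, the input name `"input"`, and the `[1][4]` float shape are illustrative assumptions, not from a specific example:

```java
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtSession;
import java.util.Map;

public class RawRuntimeExample {
    public static void main(String[] args) throws Exception {
        // Boilerplate the helper hides: environment, session, and tensor wrapping
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        try (OrtSession session =
                 env.createSession("model.onnx", new OrtSession.SessionOptions())) {
            // Wrap raw input values in an OnnxTensor; shape [1][4] here
            float[][] inputData = {{5.1f, 3.5f, 1.4f, 0.2f}};
            try (OnnxTensor input = OnnxTensor.createTensor(env, inputData);
                 OrtSession.Result result = session.run(Map.of("input", input))) {
                // Unwrap the first output back into a plain Java array
                float[][] output = (float[][]) result.get(0).getValue();
                System.out.println(java.util.Arrays.toString(output[0]));
            }
        }
    }
}
```

Every query through the raw API involves this tensor wrapping and unwrapping; the helper object reduces it to a single function call.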
The object in this library – "ONNX Helper" – represents a single connection to an ONNX file (.onnx). If you have multiple ONNX files that you want to query from, you can use multiple instances of this object.
After dragging an instance of the helper object into your model, its only property is a reference to your desired ONNX file (as a relative or absolute path). The helper object has one basic predict function, with a few variants depending on your preferences, and a handful of miscellaneous functions for common operations.
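Since the predict function's exact signature isn't shown here, the following is a hypothetical sketch of the call shape from model logic. The `predict` below is a runnable stand-in (a hard-coded function), not the library's actual implementation, and the input meanings are invented:

```java
import java.util.function.Function;

public class HelperUsageSketch {
    // Stand-in for the helper's query function (hypothetical signature);
    // a hard-coded formula replaces the real ONNX model so the sketch runs.
    static Function<double[], double[]> predict =
        in -> new double[]{ 0.5 * in[0] + in[1] };

    public static void main(String[] args) {
        double[] inputs = {10.0, 2.0};           // e.g., day-of-year, hour-of-day
        double[] outputs = predict.apply(inputs); // one call per query
        System.out.println(outputs[0]);           // prints 7.0
    }
}
```

The point of the abstraction is that, regardless of the model behind the file, a query is a single call taking plain arrays in and returning plain arrays out.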
If you run the simulation model, you can click on the helper object to see information about the ONNX model, including:
The filename.
The name, shape, and data type of all inputs.
The name, shape, and data type of all outputs.
For more detailed information about these topics, see the project’s GitHub repository and its Wiki.
Example models
The examples below demonstrate various use cases and levels of complexity; they are available from the GitHub repository and the internal repository of models shipped with any edition of AnyLogic (AnyLogic welcome screen → Example models → How-to models/Examples).
01
Simple Operations
As a demo of the library’s most basic functionality, this model uses a manually constructed ONNX file with multiple inputs and outputs of different data types. Some of the inputs/outputs also allow variable-sized arrays. The operations performed on the inputs are simple: transposing, ceiling, and multiplication.
02
Supply Chain
This model is a variant of the built-in example model “Supply Chain”, consisting of a retailer, a wholesaler, and a factory. It expands on the original model’s use of a fixed rate and distribution for demand frequency and size by adding an option to query this information from an ONNX model. It also includes a parameter variation experiment, allowing you to compare each option.
The provided ONNX model uses a recurrent neural network, trained on fabricated data that mimics real-world patterns, to predict both the time until the next order arrives and its size, based on the time of year.
03
Hospital
The model depicts a simplified hospital where patients arrive, stay for some time, and then leave. Two neural networks – and thus two ONNX Helper objects – are used. One predicts the arrival rate of patients based on the last few days’ worth of arrival rates; the other predicts a patient’s length of stay based on 24 health-related attributes of the individual.
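Feeding a network “the last few days’ worth of arrival rates” suggests keeping a fixed-length rolling window of recent values. Here is a minimal sketch; the window length of three and the flat float-array input shape are assumptions, not the example model’s actual code:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class ArrivalRateWindow {
    private final int size;
    private final Deque<Double> window = new ArrayDeque<>();

    public ArrivalRateWindow(int size) { this.size = size; }

    // Record today's arrival rate, evicting the oldest once the window is full
    public void record(double rate) {
        if (window.size() == size) window.removeFirst();
        window.addLast(rate);
    }

    // Flatten the window into the float[] shape an ONNX input expects
    public float[] asInput() {
        float[] out = new float[window.size()];
        int i = 0;
        for (double d : window) out[i++] = (float) d;
        return out;
    }

    public static void main(String[] args) {
        ArrivalRateWindow w = new ArrivalRateWindow(3);
        w.record(12); w.record(15); w.record(9); w.record(20); // 12 evicted
        System.out.println(java.util.Arrays.toString(w.asInput())); // [15.0, 9.0, 20.0]
    }
}
```

In a simulation, `record` would be called once per simulated day, and `asInput` whenever the arrival-rate prediction is queried.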
04
Iris Classification
This toy model randomly generates irises of variable sizes. For each generated flower, the provided ONNX model attempts to classify which of the three species it is, based on its petal and sepal sizes.
05
Digit Recognition
This toy model lets you draw on a 28x28 grid by placing a series of straight lines. The provided ONNX model attempts to guess which number it "sees". A subset of the MNIST (Modified National Institute of Standards and Technology) database was used to train the model.
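Before the model can make a guess, the drawn grid must be converted into the model’s input tensor. Here is a minimal sketch of that preprocessing step, assuming a flattened, normalized `[1][784]` float input (the example’s actual input shape and normalization may differ):

```java
public class GridToInput {
    // Flatten a 28x28 grid of pixel intensities (0-255) into the normalized
    // float[1][784] shape a digit-recognition model typically expects.
    public static float[][] flatten(int[][] grid) {
        float[][] input = new float[1][28 * 28];
        for (int row = 0; row < 28; row++)
            for (int col = 0; col < 28; col++)
                input[0][row * 28 + col] = grid[row][col] / 255.0f;
        return input;
    }

    public static void main(String[] args) {
        int[][] grid = new int[28][28];
        grid[0][1] = 255;               // one "drawn" pixel
        float[][] input = flatten(grid);
        System.out.println(input[0].length + " " + input[0][1]); // 784 1.0
    }
}
```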
Simulation for training and testing AI – Email Pack
AnyLogic simulation is the training and testing platform for AI in business. With AnyLogic general-purpose simulation, you can construct detailed and robust virtual environments for training and testing your AI models. Its unique multimethod simulation capabilities provide a comprehensive tool for machine learning. Already in use at leading companies across industries, this fully cloud-enabled platform with an open API is enhancing and accelerating AI development today. Find out more about this powerful machine learning tool in our AI email pack and white paper!