ModelInferenceType

The model inference type determines which inference engine runs the algorithm model and which hardware backend is used to accelerate model inference.

Entries

Runs a QNN (Qualcomm Neural Network) model using the QNN engine on Qualcomm HTP (Hexagon Tensor Processor) hardware.

Properties

Returns a representation of an immutable list of all enum entries, in the order they're declared.
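The sketch below shows how the `entries` property is typically read. Since the real entry names are not listed on this page, it re-declares a minimal stand-in enum with a single assumed entry, QNN; the real `ModelInferenceType` lives in the SDK and may declare more entries. Note that `entries` requires Kotlin 1.9 or later.

```kotlin
// Minimal stand-in for illustration only; the real ModelInferenceType is
// declared in the SDK, and QNN is an assumed entry name.
enum class ModelInferenceType { QNN }

fun main() {
    // entries is an immutable EnumEntries list, in declaration order,
    // and unlike values() it does not allocate a new array on each access.
    for (type in ModelInferenceType.entries) {
        println(type.name)
    }
}
```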

Functions

Returns the enum constant of this type with the specified name. The string must match exactly an identifier used to declare an enum constant in this type. (Extraneous whitespace characters are not permitted.)
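A hedged sketch of the exact-match behavior described above, again using a stand-in enum with an assumed entry name (QNN), since the real entries are not shown here:

```kotlin
// Stand-in declaration; QNN is an assumed entry name.
enum class ModelInferenceType { QNN }

fun main() {
    // valueOf requires an exact, case-sensitive match of the declared name.
    val type = ModelInferenceType.valueOf("QNN")
    println(type)

    // A non-matching (here, lowercase) name throws IllegalArgumentException.
    val lookup = runCatching { ModelInferenceType.valueOf("qnn") }
    println(lookup.isFailure)
}
```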

Returns an array containing the constants of this enum type, in the order they're declared.
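A small sketch of `values()` on the same assumed stand-in enum. In modern Kotlin the `entries` property is generally preferred, since `values()` allocates a fresh array on every call:

```kotlin
// Stand-in declaration; QNN is an assumed entry name.
enum class ModelInferenceType { QNN }

fun main() {
    // values() returns a new array each call, in declaration order.
    val all = ModelInferenceType.values()
    println(all.joinToString(separator = ", ") { it.name })
}
```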