types.h

Overview

Provides the model file types, device types, and related configuration types (NNRt device descriptions, performance modes, priorities, training optimization levels, and quantization types) supported by MindSpore Lite.

Library: libmindspore_lite_ndk.so

Since: 9

Related module: MindSpore

Summary

Types

OH_AI_ModelType: Defines model file types.
OH_AI_DeviceType: Defines the supported device types.
OH_AI_NNRTDeviceType: Defines NNRt device types.
OH_AI_PerformanceMode: Defines performance modes of the NNRt device.
OH_AI_Priority: Defines NNRt inference task priorities.
OH_AI_OptimizationLevel: Defines the training optimization levels.
OH_AI_QuantizationType: Defines the quantization types.
NNRTDeviceDesc: Defines the NNRt device information, including the device ID and device name.

Enums

OH_AI_ModelType {
OH_AI_MODELTYPE_MINDIR = 0,
OH_AI_MODELTYPE_INVALID = 0xFFFFFFFF
}
Model file types.
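For orientation, here is a minimal sketch of where OH_AI_ModelType is consumed: it is passed to the model build call so the runtime knows how to parse the model file. The context and model functions come from context.h and model.h of the same NDK; the include paths and the .ms file path are illustrative assumptions.

#include <stdio.h>
#include "mindspore/context.h"
#include "mindspore/model.h"
#include "mindspore/status.h"
#include "mindspore/types.h"

/* Sketch: load a MindIR (.ms) model. OH_AI_MODELTYPE_MINDIR is the model type
 * to pass when building; OH_AI_MODELTYPE_INVALID marks an invalid value. */
int LoadMindirModel(const char *model_path) {
    OH_AI_ContextHandle context = OH_AI_ContextCreate();
    OH_AI_DeviceInfoHandle cpu_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
    OH_AI_ContextAddDeviceInfo(context, cpu_info);

    OH_AI_ModelHandle model = OH_AI_ModelCreate();
    OH_AI_Status ret = OH_AI_ModelBuildFromFile(model, model_path,
                                                OH_AI_MODELTYPE_MINDIR, context);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_ModelBuildFromFile failed: %d\n", ret);
        OH_AI_ModelDestroy(&model);
        return -1;
    }
    /* ... run inference with the built model ... */
    OH_AI_ModelDestroy(&model);
    return 0;
}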
OH_AI_DeviceType {
OH_AI_DEVICETYPE_CPU = 0,
OH_AI_DEVICETYPE_GPU,
OH_AI_DEVICETYPE_KIRIN_NPU,
OH_AI_DEVICETYPE_NNRT = 60,
OH_AI_DEVICETYPE_INVALID = 100
}
Supported device types.
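A device type is what you pass to OH_AI_DeviceInfoCreate (declared in context.h) to describe one backend a context may schedule onto. The sketch below, under the assumption that device infos are tried in the order they are added, sets up NNRt with a CPU fallback; function names are from context.h of the same NDK.

#include "mindspore/context.h"
#include "mindspore/types.h"

/* Sketch: a context that prefers the NNRt backend and falls back to the CPU. */
OH_AI_ContextHandle CreateNnrtWithCpuFallbackContext(void) {
    OH_AI_ContextHandle context = OH_AI_ContextCreate();

    OH_AI_DeviceInfoHandle nnrt_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_NNRT);
    OH_AI_ContextAddDeviceInfo(context, nnrt_info);

    OH_AI_DeviceInfoHandle cpu_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
    OH_AI_ContextAddDeviceInfo(context, cpu_info);

    return context;
}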
OH_AI_NNRTDeviceType {
OH_AI_NNRTDEVICE_OTHERS = 0,
OH_AI_NNRTDEVICE_CPU = 1,
OH_AI_NNRTDEVICE_GPU = 2,
OH_AI_NNRTDEVICE_ACCELERATOR = 3
}
NNRt device types.
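The NNRt device type is reported per device by the NNRt enumeration helpers in context.h. A sketch of filtering for an accelerator follows; it assumes the NNRTDeviceDesc accessors (OH_AI_GetAllNNRTDeviceDescs, OH_AI_GetElementOfNNRTDeviceDescs, OH_AI_GetTypeFromNNRTDeviceDesc, and related functions) behave as declared in context.h of this NDK.

#include <stddef.h>
#include <stdio.h>
#include "mindspore/context.h"
#include "mindspore/types.h"

/* Sketch: walk the NNRt devices reported by the system and bind a device info
 * to the first accelerator found. NNRTDeviceDesc is opaque, so elements are
 * fetched through the accessor rather than by pointer arithmetic. */
OH_AI_DeviceInfoHandle PickFirstNnrtAccelerator(void) {
    size_t num = 0;
    NNRTDeviceDesc *descs = OH_AI_GetAllNNRTDeviceDescs(&num);
    OH_AI_DeviceInfoHandle device_info = NULL;

    for (size_t i = 0; descs != NULL && i < num; ++i) {
        NNRTDeviceDesc *desc = OH_AI_GetElementOfNNRTDeviceDescs(descs, i);
        if (OH_AI_GetTypeFromNNRTDeviceDesc(desc) == OH_AI_NNRTDEVICE_ACCELERATOR) {
            printf("Using NNRt device: %s\n", OH_AI_GetNameFromNNRTDeviceDesc(desc));
            device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_NNRT);
            OH_AI_DeviceInfoSetDeviceId(device_info,
                                        OH_AI_GetDeviceIdFromNNRTDeviceDesc(desc));
            break;
        }
    }
    if (descs != NULL) {
        OH_AI_DestroyAllNNRTDeviceDescs(&descs);
    }
    return device_info;  /* NULL if no accelerator was found */
}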
OH_AI_PerformanceMode {
OH_AI_PERFORMANCE_NONE = 0,
OH_AI_PERFORMANCE_LOW = 1,
OH_AI_PERFORMANCE_MEDIUM = 2,
OH_AI_PERFORMANCE_HIGH = 3,
OH_AI_PERFORMANCE_EXTREME = 4
}
Performance modes of the NNRt device (a combined usage sketch follows the OH_AI_Priority entry below).
OH_AI_Priority {
OH_AI_PRIORITY_NONE = 0,
OH_AI_PRIORITY_LOW = 1,
OH_AI_PRIORITY_MEDIUM = 2,
OH_AI_PRIORITY_HIGH = 3
}
NNRt inference task priorities.
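Both OH_AI_PerformanceMode and OH_AI_Priority are applied through setters on a device info handle, as in the sketch below. The setters are declared in context.h and, per the rest of this module, are meaningful only for a device info of type OH_AI_DEVICETYPE_NNRT; the specific values chosen here are illustrative, not recommendations.

#include "mindspore/context.h"
#include "mindspore/types.h"

/* Sketch: an NNRt device info tuned for high performance and high priority. */
OH_AI_DeviceInfoHandle CreateTunedNnrtDeviceInfo(void) {
    OH_AI_DeviceInfoHandle nnrt_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_NNRT);
    /* Ask the NNRt backend to favor throughput over power. */
    OH_AI_DeviceInfoSetPerformanceMode(nnrt_info, OH_AI_PERFORMANCE_HIGH);
    /* Mark inference tasks using this device info as high priority. */
    OH_AI_DeviceInfoSetPriority(nnrt_info, OH_AI_PRIORITY_HIGH);
    return nnrt_info;
}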
OH_AI_OptimizationLevel {
OH_AI_KO0 = 0,
OH_AI_KO2 = 2,
OH_AI_KO3 = 3,
OH_AI_KAUTO = 4,
OH_AI_KOPTIMIZATIONTYPE = 0xFFFFFFFF
}
Training optimization levels.
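The optimization level is consumed when building a trainable model through a training configuration. The sketch below assumes the train-config functions (OH_AI_TrainCfgCreate, OH_AI_TrainCfgSetOptimizationLevel, OH_AI_TrainModelBuildFromFile) as declared in model.h of later NDK versions; treat the exact names and parameter lists as assumptions.

#include "mindspore/context.h"
#include "mindspore/model.h"
#include "mindspore/status.h"
#include "mindspore/types.h"

/* Sketch: request the O2 training optimization level when building a
 * trainable MindIR model (train-config API assumed from model.h). */
OH_AI_Status BuildTrainModelWithO2(OH_AI_ModelHandle model, const char *model_path,
                                   OH_AI_ContextHandle context) {
    OH_AI_TrainCfgHandle train_cfg = OH_AI_TrainCfgCreate();
    OH_AI_TrainCfgSetOptimizationLevel(train_cfg, OH_AI_KO2);
    return OH_AI_TrainModelBuildFromFile(model, model_path, OH_AI_MODELTYPE_MINDIR,
                                         context, train_cfg);
}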
OH_AI_QuantizationType {
OH_AI_NO_QUANT = 0,
OH_AI_WEIGHT_QUANT = 1,
OH_AI_FULL_QUANT = 2,
OH_AI_UNKNOWN_QUANT_TYPE = 0xFFFFFFFF
}
Quantization types.
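The quantization type is typically supplied when exporting a trained model. The sketch below assumes the OH_AI_ExportModel declaration from model.h of later NDK versions; its parameter order and the output path are assumptions for illustration.

#include <stdbool.h>
#include <stddef.h>
#include "mindspore/model.h"
#include "mindspore/status.h"
#include "mindspore/types.h"

/* Sketch: export a trained model with weight quantization applied.
 * Passing OH_AI_NO_QUANT instead would keep the original precision. */
OH_AI_Status ExportWeightQuantized(OH_AI_ModelHandle model, const char *output_path) {
    return OH_AI_ExportModel(model,
                             OH_AI_MODELTYPE_MINDIR,
                             output_path,
                             OH_AI_WEIGHT_QUANT,   /* quantization type */
                             true,                 /* export inference-only graph */
                             NULL,                 /* output tensor names: export all */
                             0);
}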