model.h

Overview

Provides model-related APIs for model creation and inference.

Library: libmindspore_lite_ndk.so

Since: 9

Related module: MindSpore

Summary

Structs

Name Description
OH_AI_TensorHandleArray Defines the tensor array structure, which is used to store the tensor array pointer and tensor array length.
OH_AI_ShapeInfo Defines dimension information. The maximum number of dimensions is set by OH_AI_MAX_SHAPE_NUM.
OH_AI_CallBackParam Defines the operator information passed in a callback.
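
For orientation, the following sketch shows roughly how these structures are laid out. The field names (handle_num, handle_list, shape_num, shape, node_name, node_type) follow the MindSpore Lite C API but are not spelled out in this summary, so treat them as assumptions and verify against the model.h header shipped with your SDK.

// Sketch of the structure layouts described above (field names assumed; check model.h).
typedef struct OH_AI_TensorHandleArray {
  size_t handle_num;                // number of tensor handles in the array
  OH_AI_TensorHandle *handle_list;  // pointer to the tensor handle array
} OH_AI_TensorHandleArray;

typedef struct OH_AI_ShapeInfo {
  size_t shape_num;                    // number of valid dimensions
  int64_t shape[OH_AI_MAX_SHAPE_NUM];  // dimension values, at most OH_AI_MAX_SHAPE_NUM entries
} OH_AI_ShapeInfo;

typedef struct OH_AI_CallBackParam {
  char *node_name;  // name of the operator currently being executed
  char *node_type;  // type of the operator currently being executed
} OH_AI_CallBackParam;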

Macros

Name Description
OH_AI_MAX_SHAPE_NUM 32 Defines the maximum number of tensor dimensions.
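
As an illustration, OH_AI_MAX_SHAPE_NUM bounds the shape array inside OH_AI_ShapeInfo. The sketch below prepares dimension information for OH_AI_ModelResize; it assumes a built model with a single NHWC input, and the target shape {4, 224, 224, 3} and the function name ResizeToBatch4 are placeholders.

#include <mindspore/model.h>

// Sketch: resize the input of an already built, single-input model to batch size 4.
OH_AI_Status ResizeToBatch4(OH_AI_ModelHandle model) {
  OH_AI_TensorHandleArray inputs = OH_AI_ModelGetInputs(model);

  OH_AI_ShapeInfo shape_info;
  shape_info.shape_num = 4;  // must not exceed OH_AI_MAX_SHAPE_NUM (32)
  int64_t dims[4] = {4, 224, 224, 3};
  for (size_t i = 0; i < shape_info.shape_num; ++i) {
    shape_info.shape[i] = dims[i];
  }

  // One OH_AI_ShapeInfo per input tensor; here the model is assumed to have one input.
  return OH_AI_ModelResize(model, inputs, &shape_info, 1);
}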

Types

Name Description
OH_AI_ModelHandle Defines the pointer to a model object.
OH_AI_TrainCfgHandle Defines the pointer to a training configuration object.
OH_AI_TensorHandleArray Defines the tensor array structure, which is used to store the tensor array pointer and tensor array length.
OH_AI_ShapeInfo Defines the dimension information. The maximum number of dimensions is set by OH_AI_MAX_SHAPE_NUM.
OH_AI_CallBackParam Defines the operator information passed in a callback.
OH_AI_KernelCallBack (const OH_AI_TensorHandleArray inputs, const OH_AI_TensorHandleArray outputs, const OH_AI_CallBackParam kernel_Info) Defines the pointer to a callback function.
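
A function matching OH_AI_KernelCallBack is invoked once per operator and returns true to let execution continue (false aborts it). A minimal sketch follows; the function name PrintBeforeCallback is illustrative, and the include path is assumed from the NDK layout.

#include <stdbool.h>
#include <stdio.h>
#include <mindspore/model.h>

// Sketch of a callback compatible with OH_AI_KernelCallBack: prints the operator
// about to run and allows execution to continue.
bool PrintBeforeCallback(const OH_AI_TensorHandleArray inputs,
                         const OH_AI_TensorHandleArray outputs,
                         const OH_AI_CallBackParam kernel_Info) {
  (void)inputs;
  (void)outputs;
  printf("running operator %s (type %s)\n", kernel_Info.node_name, kernel_Info.node_type);
  return true;
}

// Usage: pass it as the "before" (or "after") argument of OH_AI_ModelPredict or
// OH_AI_RunStep, e.g. OH_AI_ModelPredict(model, inputs, &outputs, PrintBeforeCallback, NULL);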

Functions

Name Description
OH_AI_ModelCreate () Creates a model object.
OH_AI_ModelDestroy (OH_AI_ModelHandle *model) Destroys a model object.
OH_AI_ModelBuild (OH_AI_ModelHandle model, const void *model_data, size_t data_size, OH_AI_ModelType model_type, const OH_AI_ContextHandle model_context) Loads and builds a MindSpore model from the memory buffer.
OH_AI_ModelBuildFromFile (OH_AI_ModelHandle model, const char *model_path, OH_AI_ModelType model_type, const OH_AI_ContextHandle model_context) Loads and builds a MindSpore model from a model file.
OH_AI_ModelResize (OH_AI_ModelHandle model, const OH_AI_TensorHandleArray inputs, OH_AI_ShapeInfo *shape_infos, size_t shape_info_num) Adjusts the input tensor shapes of a built model.
OH_AI_ModelPredict (OH_AI_ModelHandle model, const OH_AI_TensorHandleArray inputs, OH_AI_TensorHandleArray *outputs, const OH_AI_KernelCallBack before, const OH_AI_KernelCallBack after) Performs model inference.
OH_AI_ModelGetInputs (const OH_AI_ModelHandle model) Obtains the input tensor array structure of a model.
OH_AI_ModelGetOutputs (const OH_AI_ModelHandle model) Obtains the output tensor array structure of a model.
OH_AI_ModelGetInputByTensorName (const OH_AI_ModelHandle model, const char *tensor_name) Obtains the input tensor of a model by tensor name.
OH_AI_ModelGetOutputByTensorName (const OH_AI_ModelHandle model, const char *tensor_name) Obtains the output tensor of a model by tensor name.
OH_AI_TrainCfgCreate () Creates the pointer to the training configuration object. This API is used only for on-device training.
OH_AI_TrainCfgDestroy (OH_AI_TrainCfgHandle *train_cfg) Destroys the pointer to the training configuration object. This API is used only for on-device training.
OH_AI_TrainCfgGetLossName (OH_AI_TrainCfgHandle train_cfg, size_t *num) Obtains the list of loss functions, which are used only for on-device training.
OH_AI_TrainCfgSetLossName (OH_AI_TrainCfgHandle train_cfg, const char **loss_name, size_t num) Sets the list of loss functions, which are used only for on-device training.
OH_AI_TrainCfgGetOptimizationLevel (OH_AI_TrainCfgHandle train_cfg) Obtains the optimization level of the training configuration object. This API is used only for on-device training.
OH_AI_TrainCfgSetOptimizationLevel (OH_AI_TrainCfgHandle train_cfg, OH_AI_OptimizationLevel level) Sets the optimization level of the training configuration object. This API is used only for on-device training.
OH_AI_TrainModelBuild (OH_AI_ModelHandle model, const void *model_data, size_t data_size, OH_AI_ModelType model_type, const OH_AI_ContextHandle model_context, const OH_AI_TrainCfgHandle train_cfg) Loads a training model from the memory buffer and compiles the model to a state ready for running on the device. This API is used only for on-device training.
OH_AI_TrainModelBuildFromFile (OH_AI_ModelHandle model, const char *model_path, OH_AI_ModelType model_type, const OH_AI_ContextHandle model_context, const OH_AI_TrainCfgHandle train_cfg) Loads the training model from the specified path and compiles the model to a state ready for running on the device. This API is used only for on-device training.
OH_AI_RunStep (OH_AI_ModelHandle model, const OH_AI_KernelCallBack before, const OH_AI_KernelCallBack after) Runs a single training step of the model. This API is used only for on-device training.
OH_AI_ModelSetLearningRate (OH_AI_ModelHandle model, float learning_rate) Sets the learning rate for model training. This API is used only for on-device training.
OH_AI_ModelGetLearningRate (OH_AI_ModelHandle model) Obtains the learning rate for model training. This API is used only for on-device training.
OH_AI_ModelGetWeights (OH_AI_ModelHandle model) Obtains all weight tensors of a model. This API is used only for on-device training.
OH_AI_ModelUpdateWeights (OH_AI_ModelHandle model, const OH_AI_TensorHandleArray new_weights) Updates the weight tensors of a model. This API is used only for on-device training.
OH_AI_ModelGetTrainMode (OH_AI_ModelHandle model) Obtains the training mode.
OH_AI_ModelSetTrainMode (OH_AI_ModelHandle model, bool train) Sets the training mode. This API is used only for on-device training.
OH_AI_ModelSetupVirtualBatch (OH_AI_ModelHandle model, int virtual_batch_multiplier, float lr, float momentum) Sets the virtual batch for training. This API is used only for on-device training.
OH_AI_ExportModel (OH_AI_ModelHandle model, OH_AI_ModelType model_type, const char *model_file, OH_AI_QuantizationType quantization_type, bool export_inference_only, char **output_tensor_name, size_t num) Exports a training model. This API is used only for on-device training.
OH_AI_ExportModelBuffer (OH_AI_ModelHandle model, OH_AI_ModelType model_type, char **model_data, size_t *data_size, OH_AI_QuantizationType quantization_type, bool export_inference_only, char **output_tensor_name, size_t num) Exports the training model to a memory buffer. This API is used only for on-device training.
OH_AI_ExportWeightsCollaborateWithMicro (OH_AI_ModelHandle model, OH_AI_ModelType model_type, const char *weight_file, bool is_inference, bool enable_fp16, char **changeable_weights_name, size_t num) Exports the weight file of the training model for micro inference. This API is used only for on-device training.
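
Taken together, a typical inference flow with these APIs is sketched below: create a context, create and build the model, fill the inputs, run prediction, then release the objects. This is a sketch under stated assumptions, not a complete sample: the header paths under mindspore/, the model path "model.ms", the CPU-only device choice, and the zero-filled input data are placeholders, and error handling is abbreviated. The context and tensor calls come from context.h and tensor.h rather than model.h.

#include <stdio.h>
#include <string.h>
#include <mindspore/context.h>
#include <mindspore/model.h>
#include <mindspore/status.h>
#include <mindspore/tensor.h>
#include <mindspore/types.h>

// Sketch of the usual sequence: context -> model -> build -> fill inputs -> predict -> destroy.
int RunInference(void) {
  // 1. Create a context and attach a CPU device description.
  OH_AI_ContextHandle context = OH_AI_ContextCreate();
  OH_AI_DeviceInfoHandle cpu_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
  OH_AI_ContextAddDeviceInfo(context, cpu_info);

  // 2. Create the model object and build it from a model file ("model.ms" is a placeholder).
  OH_AI_ModelHandle model = OH_AI_ModelCreate();
  OH_AI_Status ret = OH_AI_ModelBuildFromFile(model, "model.ms",
                                              OH_AI_MODELTYPE_MINDIR, context);
  if (ret != OH_AI_STATUS_SUCCESS) {
    OH_AI_ModelDestroy(&model);
    OH_AI_ContextDestroy(&context);
    return -1;
  }

  // 3. Fill every input tensor; zeroing the buffers is only a stand-in for real input data.
  OH_AI_TensorHandleArray inputs = OH_AI_ModelGetInputs(model);
  for (size_t i = 0; i < inputs.handle_num; ++i) {
    void *data = OH_AI_TensorGetMutableData(inputs.handle_list[i]);
    memset(data, 0, OH_AI_TensorGetDataSize(inputs.handle_list[i]));
  }

  // 4. Run inference; no per-operator callbacks are registered here.
  OH_AI_TensorHandleArray outputs;
  ret = OH_AI_ModelPredict(model, inputs, &outputs, NULL, NULL);
  if (ret == OH_AI_STATUS_SUCCESS) {
    for (size_t i = 0; i < outputs.handle_num; ++i) {
      printf("output %zu: %s\n", i, OH_AI_TensorGetName(outputs.handle_list[i]));
    }
  }

  // 5. Release the model and the context.
  OH_AI_ModelDestroy(&model);
  OH_AI_ContextDestroy(&context);
  return ret == OH_AI_STATUS_SUCCESS ? 0 : -1;
}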