Accelerating Paddle-Lite OCR with the NPU on Huawei Ascend hardware (NNAdapter)

  • References
    • NNAdapter reference documentation
    • Huawei Ascend NPU reference documentation
    • Paddle-Lite C++ API reference
  • I. Make sure the CPU version runs correctly
  • II. Build the Ascend NPU acceleration library
  • III. Run the NPU-accelerated demo
    • 1. Demo download
    • 2. Reference guide
    • 3. Modify run.sh
      • (1) Change HUAWEI_ASCEND_TOOLKIT_HOME
      • (2) Change NNADAPTER_DEVICE_NAMES, NNADAPTER_MODEL_CACHE_DIR, NNADAPTER_MODEL_CACHE_TOKEN
    • 4. Full logs
      • Log without the nnc model conversion
      • Log including the nnc model conversion
  • IV. Adapting the OCR code to NNAdapter / Ascend NPU
  • V. Environment variables must be set!
    • 1. Write them into the environment
    • 2. Temporary environment variables in the script
  • VI. Some models cannot be converted to the nnc format for acceleration
  • VII. Other approaches to consider

References

* NNAdapter reference documentation

https://www.paddlepaddle.org.cn/inference/develop_guides/nnadapter.html#id3

* Huawei Ascend NPU reference documentation

https://www.paddlepaddle.org.cn/lite/develop/demo_guides/huawei_ascend_npu.html#npu

* Paddle-Lite C++ API reference

https://www.paddlepaddle.org.cn/lite/develop/api_reference/cxx_api_doc.html#place

I. Make sure the CPU version runs correctly

See my earlier tutorial:
http://t.csdnimg.cn/Nd4Qv

II. Build the Ascend NPU acceleration library

(screenshot)
My earlier tutorial mentioned that building the ascend_npu acceleration library failed; it turned out to be a version incompatibility between Paddle-Lite and the ascend-toolkit, which made the build fail for no obvious reason. This time, building with Paddle-Lite 2.10 and toolkit 5.0.2.alpha003 succeeded.
(screenshot)
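A build invocation along the following lines should reproduce this; it is a sketch based on the Paddle-Lite Huawei Ascend NPU build guide rather than the exact command used here, and the toolkit path is machine-specific, so point it at your own install:

# Sketch: build Paddle-Lite 2.10 with NNAdapter + Huawei Ascend NPU support on linux/arm64.
# Verify the flags against the build guide linked above; the SDK root below is an example path.
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite
git checkout v2.10        # or whichever 2.10 release tag you actually use
./lite/tools/build_linux.sh \
  --arch=armv8 \
  --with_extra=ON \
  --with_log=ON \
  --with_exception=ON \
  --with_nnadapter=ON \
  --nnadapter_with_huawei_ascend_npu=ON \
  --nnadapter_huawei_ascend_npu_sdk_root=/home/HwHiAiUser/Ascend/ascend-toolkit/5.0.2.alpha003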

III. Run the NPU-accelerated demo

1. Demo download

https://paddlelite-demo.bj.bcebos.com/devices/generic/PaddleLite-generic-demo.tar.gz

2. Reference guide

https://www.paddlepaddle.org.cn/lite/develop/demo_guides/huawei_ascend_npu.html#npu-paddle-lite

3. Modify run.sh

(1) Change HUAWEI_ASCEND_TOOLKIT_HOME

Make sure to change HUAWEI_ASCEND_TOOLKIT_HOME to the path of your own Ascend toolkit installation; otherwise the NPU cannot be used for acceleration.

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:.:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib/$NNADAPTER_DEVICE_NAMES:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib/cpu
if [ "$NNADAPTER_DEVICE_NAMES" == "huawei_ascend_npu" ]; then
    HUAWEI_ASCEND_TOOLKIT_HOME="/home/HwHiAiUser/Ascend/ascend-toolkit/5.0.2.alpha003" #"/usr/local/Ascend/ascend-toolkit/6.0.RC1.alpha003"
    if [ "$TARGET_OS" == "linux" ]; then
      if [[ "$TARGET_ABI" != "arm64" && "$TARGET_ABI" != "amd64" ]]; then
        echo "Unknown OS $TARGET_OS, only supports 'arm64' or 'amd64' for Huawei Ascend NPU."
        exit -1
      fi
    else
      echo "Unknown OS $TARGET_OS, only supports 'linux' for Huawei Ascend NPU."
      exit -1
    fi
    NNADAPTER_CONTEXT_PROPERTIES="HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS=0"
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/fwkacllib/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/opp/op_proto/built-in
    export PYTHONPATH=$PYTHONPATH:$HUAWEI_ASCEND_TOOLKIT_HOME/fwkacllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/pyACL/python/site-packages/acl
    export PATH=$PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/ccec_compiler/bin:${HUAWEI_ASCEND_TOOLKIT_HOME}/acllib/bin:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/bin
    export ASCEND_AICPU_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME
    export ASCEND_OPP_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME/opp
    export TOOLCHAIN_HOME=$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit
    export ASCEND_SLOG_PRINT_TO_STDOUT=1
    export ASCEND_GLOBAL_LOG_LEVEL=1
fi

(2) Change NNADAPTER_DEVICE_NAMES, NNADAPTER_MODEL_CACHE_DIR, NNADAPTER_MODEL_CACHE_TOKEN

NNADAPTER_DEVICE_NAMES="huawei_ascend_npu"

NNADAPTER_MODEL_CACHE_DIR is the folder that holds the nnc model. Before any nnc model has been generated, you only need to point it at the directory where the model should be written and leave
NNADAPTER_MODEL_CACHE_TOKEN="null". Once the nnc model has been saved, set NNADAPTER_MODEL_CACHE_TOKEN to the name of the generated model (see the sketch below the screenshots).

(screenshot)

(screenshot)
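Concretely, the cache-related settings in run.sh end up looking roughly like this; the directory layout and the token value are taken from the log in the next section, so substitute your own:

# First run: no nnc model exists yet, so only the cache directory needs to be set.
NNADAPTER_DEVICE_NAMES="huawei_ascend_npu"
NNADAPTER_MODEL_CACHE_DIR="$(pwd)/cache"      # i.e. the shell/cache folder seen in the log; any writable folder works
NNADAPTER_MODEL_CACHE_TOKEN="null"
# Later runs: set the token to the name of the generated model, e.g. the value that appears in the log below:
# NNADAPTER_MODEL_CACHE_TOKEN="54de7d70d80b9883fb1f8d10f70de40e"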

4. Full logs

It turns out that calling the NPU follows this flow: the .nb model is first converted into the nnc format, and the NPU then runs accelerated inference on that nnc model. If an nnc model has already been converted and saved in the corresponding folder, it is loaded directly and the conversion step is skipped, so inference starts right away.
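To confirm that later runs really skip the conversion, check the cache folder after the first run; the path below is the demo path from the log, and the file should be named after the cache token (the exact extension is an assumption here):

ls /home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/cache
# example output: 54de7d70d80b9883fb1f8d10f70de40e.nnc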

Log without the nnc model conversion

huawei_ascend_npu null /home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/cache 54de7d70d80b9883fb1f8d10f70de40e null
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 268----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 284----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 288----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 300----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 305----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 307----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 371----------------
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1118 Setup] ARM multiprocessors name: 
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1119 Setup] ARM multiprocessors number: 8
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 0, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 1, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 2, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 3, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 4, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 5, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.580 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 6, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1121 Setup] ARM multiprocessors ID: 7, max freq: 0, min freq: 0, cluster ID: 0, CPU ARCH: A55
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1127 Setup] L1 DataCache size is: 
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1129 Setup] 32 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1131 Setup] L2 Cache size is: 
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1133 Setup] 512 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1135 Setup] L3 Cache size is: 
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1137 Setup] 0 KB
[I 11/16  2:59: 7.581 ...lic/paddle-lite/lite/core/device_info.cc:1139 Setup] Total memory: 7952452KB
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 398----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 399----------------
[4 11/16  2:59: 7.619 ...e-lite/lite/model_parser/model_parser.cc:780 LoadModelNaiveFromFile] Meta_version:2
[4 11/16  2:59: 7.619 ...e-lite/lite/model_parser/model_parser.cc:873 LoadModelFbsFromFile] Opt_version:9dd4e2655
[4 11/16  2:59: 7.619 ...e-lite/lite/model_parser/model_parser.cc:888 LoadModelFbsFromFile] topo_size: 74904
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.621 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.622 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.623 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.624 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.625 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:226 InputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:247 OutputArgumentNames] This function call is expensive.
[5 11/16  2:59: 7.626 .../lite/model_parser/flatbuffers/op_desc.h:272 AttrNames] This function call is expensive.
[4 11/16  2:59: 7.673 ...e-lite/lite/model_parser/model_parser.cc:803 LoadModelNaiveFromFile] paddle_version:0
[4 11/16  2:59: 7.673 ...e-lite/lite/model_parser/model_parser.cc:804 LoadModelNaiveFromFile] Load naive buffer model in '/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/assets/models/mobilenet_v1_fp32_224.nb' successfully
[3 11/16  2:59: 7.674 .../public/paddle-lite/lite/core/program.cc:327 RuntimeProgram] Found the attr '__@kernel_type_attr@__': feed/def/1/4/2 for feed
[5 11/16  2:59: 7.674 .../public/paddle-lite/lite/core/op_lite.cc:90 operator()] pick kernel for feed host/any/any get 1 kernels
[5 11/16  2:59: 7.674 .../public/paddle-lite/lite/core/op_lite.cc:124 CreateKernels] op feed get 1 kernels
[3 11/16  2:59: 7.674 .../public/paddle-lite/lite/core/program.cc:327 RuntimeProgram] Found the attr '__@kernel_type_attr@__': subgraph/def/18/4/1 for subgraph
[5 11/16  2:59: 7.674 .../public/paddle-lite/lite/core/op_lite.cc:90 operator()] pick kernel for subgraph nnadapter/any/NCHW get 1 kernels
[5 11/16  2:59: 7.675 .../public/paddle-lite/lite/core/op_lite.cc:90 operator()] pick kernel for subgraph nnadapter/any/any get 0 kernels
[5 11/16  2:59: 7.675 .../public/paddle-lite/lite/core/op_lite.cc:124 CreateKernels] op subgraph get 1 kernels
[3 11/16  2:59: 7.675 .../public/paddle-lite/lite/core/program.cc:327 RuntimeProgram] Found the attr '__@kernel_type_attr@__': fetch/def/1/4/2 for fetch
[5 11/16  2:59: 7.675 .../public/paddle-lite/lite/core/op_lite.cc:90 operator()] pick kernel for fetch host/any/any get 1 kernels
[5 11/16  2:59: 7.675 .../public/paddle-lite/lite/core/op_lite.cc:124 CreateKernels] op fetch get 1 kernels
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 403----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 196----------------
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 202----------------
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:60 Initialize] The NNAdapter library libnnadapter.so is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:73 Initialize] NNAdapter_getVersion is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:74 Initialize] NNAdapter_getDeviceCount is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:75 Initialize] NNAdapterDevice_acquire is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:76 Initialize] NNAdapterDevice_release is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:77 Initialize] NNAdapterDevice_getName is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:78 Initialize] NNAdapterDevice_getVendor is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:79 Initialize] NNAdapterDevice_getType is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:80 Initialize] NNAdapterDevice_getVersion is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:81 Initialize] NNAdapterContext_create is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:82 Initialize] NNAdapterContext_destroy is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:83 Initialize] NNAdapterModel_create is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:84 Initialize] NNAdapterModel_destroy is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:85 Initialize] NNAdapterModel_finish is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:86 Initialize] NNAdapterModel_addOperand is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:87 Initialize] NNAdapterModel_setOperandValue is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:88 Initialize] NNAdapterModel_getOperandType is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:89 Initialize] NNAdapterModel_addOperation is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:90 Initialize] NNAdapterModel_identifyInputsAndOutputs is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:91 Initialize] NNAdapterCompilation_create is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:92 Initialize] NNAdapterCompilation_destroy is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:93 Initialize] NNAdapterCompilation_finish is loaded.
[4 11/16  2:59: 7.693 .../backends/nnadapter/nnadapter_wrapper.cc:94 Initialize] NNAdapterCompilation_queryInputsAndOutputs is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:95 Initialize] NNAdapterExecution_create is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:96 Initialize] NNAdapterExecution_destroy is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:97 Initialize] NNAdapterExecution_setInput is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:98 Initialize] NNAdapterExecution_setOutput is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:99 Initialize] NNAdapterExecution_compute is loaded.
[4 11/16  2:59: 7.694 .../backends/nnadapter/nnadapter_wrapper.cc:101 Initialize] Extract all of symbols from libnnadapter.so done.
[5 11/16  2:59: 7.878 ...pter/driver/huawei_ascend_npu/utility.cc:36 InitializeAscendCL] Initialize AscendCL.
[3 11/16  2:59: 7.990 ...le-lite/lite/kernels/nnadapter/engine.cc:295 Engine] NNAdapter device huawei_ascend_npu: vendor=Huawei type=2 version=1
[3 11/16  2:59: 7.990 ...le-lite/lite/kernels/nnadapter/engine.cc:306 Engine] NNAdapter context_properties: 
[I 11/16  2:59: 7.990 ...apter/driver/huawei_ascend_npu/engine.cc:34 Context] properties: 
[I 11/16  2:59: 7.990 ...apter/driver/huawei_ascend_npu/engine.cc:51 Context] selected device ids: 
[I 11/16  2:59: 7.990 ...apter/driver/huawei_ascend_npu/engine.cc:53 Context] 0
[3 11/16  2:59: 7.990 ...le-lite/lite/kernels/nnadapter/engine.cc:313 Engine] NNAdapter model_cache_dir: /home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/cache
[3 11/16  2:59: 7.990 ...le-lite/lite/kernels/nnadapter/engine.cc:348 Run] NNAdapter model_cache_token: 54de7d70d80b9883fb1f8d10f70de40e
[3 11/16  2:59: 8.  6 ...le-lite/lite/kernels/nnadapter/engine.cc:351 Run] NNAdapter model_cache_buffer size: 10308356
[I 11/16  2:59: 8.262 ...adapter/nnadapter/runtime/compilation.cc:65 Compilation] Deserialize the cache models from memory success.
[I 11/16  2:59: 8.262 ...apter/driver/huawei_ascend_npu/driver.cc:66 CreateProgram] Create program for huawei_ascend_npu.
[3 11/16  2:59: 8.262 ...apter/driver/huawei_ascend_npu/engine.cc:91 Build] Model input count: 1
[3 11/16  2:59: 8.262 ...apter/driver/huawei_ascend_npu/engine.cc:94 Build] Model output count: 1
[5 11/16  2:59: 8.262 ...driver/huawei_ascend_npu/model_client.cc:29 AclModelClient] Create a ACL model client(device_id=0)
[3 11/16  2:59: 8.263 ...driver/huawei_ascend_npu/model_client.cc:33 AclModelClient] device_count: 1
[5 11/16  2:59: 8.263 ...driver/huawei_ascend_npu/model_client.cc:48 InitAclClientEnv] ACL set device(device_id_=0)
[5 11/16  2:59: 8.395 ...driver/huawei_ascend_npu/model_client.cc:50 InitAclClientEnv] ACL create context
[3 11/16  2:59: 8.551 ...driver/huawei_ascend_npu/model_client.cc:199 CreateModelIODataset] input_count: 1
[5 11/16  2:59: 8.551 ...driver/huawei_ascend_npu/model_client.cc:202 CreateModelIODataset] The buffer length of model input tensor 0:602112
[3 11/16  2:59: 8.552 ...driver/huawei_ascend_npu/model_client.cc:217 CreateModelIODataset] output_count: 1
[5 11/16  2:59: 8.552 ...driver/huawei_ascend_npu/model_client.cc:220 CreateModelIODataset] The buffer length of model output tensor 0:4000
[5 11/16  2:59: 8.552 ...driver/huawei_ascend_npu/model_client.cc:229 CreateModelIODataset] Create input and output dataset success.
[3 11/16  2:59: 8.552 ...driver/huawei_ascend_npu/model_client.cc:113 LoadModel] Load a ACL model success.
[3 11/16  2:59: 8.552 ...driver/huawei_ascend_npu/model_client.cc:156 GetModelIOTensorDim] input_count: 1
[5 11/16  2:59: 8.553 ...pter/driver/huawei_ascend_npu/utility.cc:365 ConvertACLFormatToGEFormat] geFormat: FORMAT_ND = 2
[5 11/16  2:59: 8.553 ...pter/driver/huawei_ascend_npu/utility.cc:344 ConvertACLDataTypeToGEDataType] geDataType: DT_FLOAT=0
[3 11/16  2:59: 8.553 ...driver/huawei_ascend_npu/model_client.cc:170 GetModelIOTensorDim] output_count: 1
[5 11/16  2:59: 8.553 ...pter/driver/huawei_ascend_npu/utility.cc:365 ConvertACLFormatToGEFormat] geFormat: FORMAT_ND = 2
[5 11/16  2:59: 8.553 ...pter/driver/huawei_ascend_npu/utility.cc:344 ConvertACLDataTypeToGEDataType] geDataType: DT_FLOAT=0
[5 11/16  2:59: 8.553 ...driver/huawei_ascend_npu/model_client.cc:183 GetModelIOTensorDim] Get input and output dimensions from a ACL model success.
[3 11/16  2:59: 8.553 ...apter/driver/huawei_ascend_npu/engine.cc:171 Build] CANN input tensors[0]: {1,3,224,224} 1,3,224,224
[3 11/16  2:59: 8.553 ...apter/driver/huawei_ascend_npu/engine.cc:192 Build] CANN output tensors[0]: {1,1000} 1,1000
[3 11/16  2:59: 8.553 ...apter/driver/huawei_ascend_npu/engine.cc:208 Build] Build success.
[3 11/16  2:59: 8.563 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 2444 us
[5 11/16  2:59: 8.563 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.563 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 3503 us
--------------/home/HwHiAiUser/wjp/hw_ocr/govern_test/image_classification_demo/shell/image_classification_demo.cc  Line: 205----------------
[3 11/16  2:59: 8.566 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 2102 us
[5 11/16  2:59: 8.566 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.566 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 2734 us
iter 0 cost: 2.798000 ms
[3 11/16  2:59: 8.579 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 1997 us
[5 11/16  2:59: 8.579 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.579 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 2906 us
iter 1 cost: 2.994000 ms
[3 11/16  2:59: 8.592 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 2004 us
[5 11/16  2:59: 8.592 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.592 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 2917 us
iter 2 cost: 2.995000 ms
[3 11/16  2:59: 8.605 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 2653 us
[5 11/16  2:59: 8.606 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.606 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 3519 us
iter 3 cost: 3.594000 ms
[3 11/16  2:59: 8.618 ...driver/huawei_ascend_npu/model_client.cc:343 Process] Process cost 1777 us
[5 11/16  2:59: 8.619 ...driver/huawei_ascend_npu/model_client.cc:370 Process] Process a ACL model success.
[3 11/16  2:59: 8.619 ...le-lite/lite/kernels/nnadapter/engine.cc:232 Execute] Process cost 2637 us
iter 4 cost: 2.721000 ms
warmup: 1 repeat: 5, average: 3.020400 ms, max: 3.594000 ms, min: 2.721000 ms
results: 3
Top0  tabby, tabby cat - 0.529785
Top1  Egyptian cat - 0.418945
Top2  tiger cat - 0.045227
Preprocess time: 1.720000 ms
Prediction time: 3.020400 ms
Postprocess time: 0.393000 ms

[I 11/16  2:59: 8.633 ...apter/driver/huawei_ascend_npu/driver.cc:85 DestroyProgram] Destroy program for huawei_ascend_npu.
[5 11/16  2:59: 8.633 ...driver/huawei_ascend_npu/model_client.cc:252 DestroyDataset] Destroy a ACL dataset success.
[5 11/16  2:59: 8.633 ...driver/huawei_ascend_npu/model_client.cc:252 DestroyDataset] Destroy a ACL dataset success.
[5 11/16  2:59: 8.643 ...driver/huawei_ascend_npu/model_client.cc:141 UnloadModel] Unload a ACL model success(model_id=1)
[5 11/16  2:59: 8.643 ...driver/huawei_ascend_npu/model_client.cc:55 FinalizeAclClientEnv] Destroy ACL context
[5 11/16  2:59: 8.645 ...driver/huawei_ascend_npu/model_client.cc:60 FinalizeAclClientEnv] Reset ACL device(device_id_=0)
[5 11/16  2:59: 8.649 ...driver/huawei_ascend_npu/model_client.cc:79 FinalizeAclProfilingEnv] Destroy ACL profiling config
[5 11/16  2:59: 8.657 ...pter/driver/huawei_ascend_npu/utility.cc:26 FinalizeAscendCL] Finalize AscendCL.

Log including the nnc model conversion

IV. Adapting the OCR code to NNAdapter / Ascend NPU

Studying the demo shows that, for Paddle-Lite to call the Ascend NPU, the key change in the code is adding the nnadapter_device parameter; on Ascend it should be set to huawei_ascend_npu.
(screenshot)
The part in the red box below is what I rewrote based on the demo, but it throws an unidentified error.
(screenshot)
The exact cause is still unknown.
(screenshot)
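When the error gives little to go on, turning up the log verbosity before re-running the rewritten OCR binary can help narrow it down. The two ASCEND_* switches are the same ones the demo's run.sh already sets; GLOG_v controls Paddle-Lite's own VLOG level (treat the exact values as suggestions):

export GLOG_v=5                        # Paddle-Lite log verbosity (higher = noisier)
export ASCEND_SLOG_PRINT_TO_STDOUT=1   # print CANN logs to stdout instead of log files
export ASCEND_GLOBAL_LOG_LEVEL=1       # 0: debug, 1: info, 2: warning, 3: error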

V. Environment variables must be set!

The Ascend-related paths must be written into the environment variables, and they must not be exported again inside the script that launches the executable; otherwise you will keep running into strange errors.

1. Write them into the environment

vim ~/.bashrc

Adjust the paths below according to where the ascend-toolkit is installed in your own environment.

export LD_LIBRARY_PATH=/usr/local/Ascend/acllib/lib64
export install_path=/usr/local/Ascend/ascend-toolkit/latest
# package install path; adjust to your environment
export ASCEND_OPP_PATH=${install_path}/opp
export ASCEND_AICPU_PATH=${install_path}/arm64-linux    # replace the arch part (arm64 or x86_64) according to your platform
export TOOLCHAIN_HOME=${install_path}/toolkit
# needed when developing offline inference programs
export LD_LIBRARY_PATH=${install_path}/acllib/lib64:$LD_LIBRARY_PATH
export PYTHONPATH=${install_path}/toolkit/python/site-packages:${install_path}/pyACL/python/site-packages/acl:$PYTHONPATH
# needed for model conversion / operator development
export LD_LIBRARY_PATH=${install_path}/atc/lib64:$LD_LIBRARY_PATH
export PATH=${install_path}/atc/ccec_compiler/bin:${install_path}/atc/bin:$PATH
export PYTHONPATH=${install_path}/toolkit/python/site-packages:${install_path}/atc/python/site-packages:$PYTHONPATH
# Python 3.7.5 environment variables; replace with your actual paths
export LD_LIBRARY_PATH=/usr/local/python3.7.5/lib:/home/HwHiAiUser/aicamera/lib:$LD_LIBRARY_PATH
export PATH=/usr/local/python3.7.5/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/protobuf/lib:$LD_LIBRARY_PATH
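After editing ~/.bashrc, reload it and spot-check that the variables actually point at the toolkit install, for example:

source ~/.bashrc
echo $ASCEND_AICPU_PATH
echo $LD_LIBRARY_PATH | tr ':' '\n' | grep -i ascend   # the Ascend lib64 paths should show up here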

2. Temporary environment variables in the script

# export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:.:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib/$NNADAPTER_DEVICE_NAMES:../../libs/PaddleLite/$TARGET_OS/$TARGET_ABI/lib/cpu
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/HwHiAiUser/wjp/hw_ocr/govern_test/libs/PaddleLite/linux/arm64/lib
    # HUAWEI_ASCEND_TOOLKIT_HOME="/usr/local/Ascend/ascend-toolkit/5.0.2.alpha003" #"/usr/local/Ascend/ascend-toolkit/6.0.RC1.alpha003"
    NNADAPTER_CONTEXT_PROPERTIES="HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS=0"
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/Ascend/driver/lib64
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/Ascend/driver/lib64/stub
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/fwkacllib/lib64
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/lib64
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/lib64
    # export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/opp/op_proto/built-in

    # export PYTHONPATH=$PYTHONPATH:$HUAWEI_ASCEND_TOOLKIT_HOME/fwkacllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/pyACL/python/site-packages/acl
    # export PATH=$PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/ccec_compiler/bin:${HUAWEI_ASCEND_TOOLKIT_HOME}/acllib/bin:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/bin
    # export ASCEND_AICPU_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME
    # export ASCEND_OPP_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME/opp
    # export TOOLCHAIN_HOME=$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit
    # export ASCEND_SLOG_PRINT_TO_STDOUT=1
    # export ASCEND_GLOBAL_LOG_LEVEL=1

(screenshot)
The executable needs to load the corresponding shared libraries at runtime, which is why these temporary environment variables still have to be written into the script.
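If the program still fails to start, checking which shared libraries the executable actually resolves usually points at the missing path; the binary name below is a placeholder for your own OCR executable:

ldd ./ocr_demo | grep -E "not found|ascend|paddle"   # any "not found" entry means a library path is still missing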

VI. Some models cannot be converted to the nnc format for acceleration

I then thought of a new approach: directly use the demo that already calls the NPU successfully to convert these OCR models into the nnc format.
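A rough sketch of that workflow, assuming the demo's run.sh can be pointed at a different .nb model (the file name below is an example, not the exact model used here):

cp ch_PP-OCRv2_det_opt.nb ../assets/models/   # example .nb model exported by opt; adjust the name and path
rm -rf ./cache && mkdir -p ./cache            # empty cache so the nb -> nnc conversion runs again
./run.sh                                      # a successful conversion ends with "Build success." in the log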

  • Ascend documentation
    https://www.paddlepaddle.org.cn/lite/develop/demo_guides/huawei_ascend_npu.html#npu
    (screenshot)
    Among the models listed as supported in the documentation above, the det models convert successfully, while the rec models do not.
    (screenshot)
    (screenshot)
    (screenshot)
    The three screenshots above show the conversion results for ch_ppocr_mobile_v2.0_rec, ch_ppocr_server_v2.0_rec, and ch_PP-OCRv2_rec; all three conversions fail.
    (screenshot)
    The det model converts successfully.

This shows that, other factors aside, some models simply cannot be converted at all.

VII. Other approaches to consider

To be continued.
