My installed environment:
paddle-bfloat 0.1.7
paddle-serving-app 0.9.0
paddle-serving-client 0.9.0
paddle-serving-server-gpu 0.9.0.post112
paddle2onnx 1.0.5
paddlefsl 1.1.0
paddlehub 2.3.1
paddlenlp 2.4.9
paddlepaddle-gpu 2.3.2.post116
paddleslim 2.2.1
paddlex 2.1.0

My question:
I am deploying a PicoDet model with Paddle Serving. After training, running inference with the exported inference model works fine, but after converting it to the Paddle Serving format, loading fails with the error below. I don't understand what the feed_dict output dimensions and contents mean, and the postprocessing step raises the error.
My config.yml is as follows:
# Model path
model_config: "serving_server"
# Compute device type: if unset, decided by devices (CPU/GPU); 0 = cpu, 1 = gpu, 2 = TensorRT, 3 = arm cpu, 4 = kunlun xpu
device_type: 1
# Compute device IDs: when devices is "" or unset, predict on CPU; when devices is "0" or "0,1,2", predict on the listed GPU cards
devices: "0" # "0,1"
# Client type: brpc, grpc, or local_predictor. local_predictor does not start a Serving service; prediction runs in-process
client_type: local_predictor
# Fetch result list, keyed by the alias_name of fetch_var in client_config
fetch_list: ['transpose_0.tmp_0', 'transpose_1.tmp_0', 'transpose_2.tmp_0', 'transpose_3.tmp_0', 'transpose_4.tmp_0', 'transpose_5.tmp_0', 'transpose_6.tmp_0', 'transpose_7.tmp_0']
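Since fetch_list must match the alias_name of the fetch_var entries in serving_server_conf.prototxt, a quick consistency check can rule out a naming mismatch. This is a hedged sketch, not an official Paddle Serving API: `fetch_aliases` is a hypothetical helper that does a plain text scan and assumes the usual `fetch_var { ... alias_name: "..." ... }` layout of the prototxt.

```python
import re

def fetch_aliases(prototxt_text):
    """Hypothetical helper: collect alias_name values from fetch_var
    blocks via a simple line scan of serving_server_conf.prototxt."""
    aliases, in_fetch = [], False
    for line in prototxt_text.splitlines():
        line = line.strip()
        if line.startswith("fetch_var"):
            in_fetch = True
        elif in_fetch and line.startswith("alias_name"):
            m = re.search(r'"([^"]+)"', line)
            if m:
                aliases.append(m.group(1))
        elif line == "}":
            in_fetch = False  # end of the current block
    return aliases

# Small illustrative fragment; feed_var aliases are ignored on purpose.
sample = """
feed_var {
  name: "image"
  alias_name: "image"
}
fetch_var {
  name: "transpose_0.tmp_0"
  alias_name: "transpose_0.tmp_0"
}
"""
print(fetch_aliases(sample))  # only fetch_var aliases are collected
```

Comparing this list against the config.yml fetch_list shows whether the config references tensors the converted model does not actually expose.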
My serving_server_conf.prototxt is as follows:

The error location I have pinned down:
I suspect the feed_dict output is wrong after the model conversion: what I get out is not bbox_num and the bbox values. Could the maintainers please take a look at where the problem is and help resolve it?