Does the Luckfox RV1106 support TensorFlow Lite?
-
I just bought a Luckfox RV1106 yesterday, and I'd like to know whether it supports TensorFlow Lite or TensorFlow Lite Micro.
It seems like currently only MobileNetV2 and YOLOv5 are well supported on this board, which aren't bad models.
For faster inference use MobileNet; for more accurate inference use YOLOv5.
grayfacenospace wrote: ↑2024-01-23 18:31
It seems like currently only MobileNetV2 and YOLOv5 are well supported on this board, which aren't bad models. For faster inference use MobileNet; for more accurate inference use YOLOv5.

How do I use MobileNetV2 on the Luckfox?
TensorFlow Lite should be supported via the RKNN Toolkit2, which converts TFLite models into the RKNN format used by the NPU:
https://github.com/rockchip-linux/rknn-toolkit2
C++ examples are here: https://github.com/rockchip-linux/rknpu ... 106_RV1103
https://github.com/airockchip/rknn_model_zoo
"RKNN Model Zoo is developed based on the RKNPU SDK toolchain and provides deployment examples for current mainstream algorithms. Include the process of exporting the RKNN model and using Python API and CAPI to infer the RKNN model.
Support RK3562, RK3566, RK3568, RK3588 platforms. (RV1103, RV1106 platforms support mobilenet, yolov5)
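Roughly, the PC-side conversion flow should look something like the sketch below (untested here; the model file name, mean/std normalization values and the calibration list are placeholders, so check the toolkit's own examples for your model):

```python
# Rough sketch of converting a TFLite model to RKNN with rknn-toolkit2 (run on a PC).
# Model name, mean/std values and the calibration list are placeholders.
from rknn.api import RKNN

rknn = RKNN(verbose=True)

# Target the RV1106 NPU; normalization must match how the model was trained.
rknn.config(mean_values=[[127.5, 127.5, 127.5]],
            std_values=[[127.5, 127.5, 127.5]],
            target_platform='rv1106')

# Load the TensorFlow Lite model to be converted.
if rknn.load_tflite(model='mobilenet_v2.tflite') != 0:
    raise RuntimeError('load_tflite failed')

# Quantize using a text file listing calibration images (one path per line).
if rknn.build(do_quantization=True, dataset='./dataset.txt') != 0:
    raise RuntimeError('build failed')

# Export the .rknn file that the board-side runtime loads.
if rknn.export_rknn('mobilenet_v2.rknn') != 0:
    raise RuntimeError('export_rknn failed')

rknn.release()
```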
tortuga wrote: ↑2024-01-26 9:48
TensorFlow Lite should be supported using the RKNN Toolkit2
https://github.com/rockchip-linux/rknn-toolkit2
https://github.com/airockchip/rknn_model_zoo
"RKNN Model Zoo is developed based on the RKNPU SDK toolchain and provides deployment examples for current mainstream algorithms. Include the process of exporting the RKNN model and using Python API and CAPI to infer the RKNN model. Support RK3562, RK3566, RK3568, RK3588 platforms. (RV1103, RV1106 platforms support mobilenet, yolov5)"

According to the latest repo, only MobileNet and YOLOv5 are compatible right now. Those are already rather complicated to use, so I doubt TFLite is easy to get working.
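For a quick sanity check before touching the board, the toolkit's Python API can run the converted model on its PC simulator. A minimal sketch, assuming the conversion steps from the earlier example and a placeholder 224x224 test image (on the board itself you would use the C API from the rknpu2 / model zoo examples instead):

```python
# Quick sanity check on the PC simulator with rknn-toolkit2 before deploying to the board.
# The image path and 224x224 input size are placeholders for a MobileNetV2-style model.
import cv2
import numpy as np
from rknn.api import RKNN

rknn = RKNN()
rknn.config(mean_values=[[127.5, 127.5, 127.5]],
            std_values=[[127.5, 127.5, 127.5]],
            target_platform='rv1106')
rknn.load_tflite(model='mobilenet_v2.tflite')
rknn.build(do_quantization=True, dataset='./dataset.txt')

# No target given -> run on the PC simulator rather than the RV1106 itself.
rknn.init_runtime()

img = cv2.imread('test.jpg')
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (224, 224))

# NHWC input with an explicit batch dimension.
outputs = rknn.inference(inputs=[np.expand_dims(img, 0)])
print('top-1 class index:', int(np.argmax(outputs[0])))

rknn.release()
```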