Custom YOLOv5 model (yolov5n) - Segmentation fault (core dumped)

  • Please let me know if anything can be done; I have tried everything I can think of.

    When I tried to convert my model, it reported that dill was not found, so I installed that library, and the model then converted to ONNX.

    Code: Select all

    (yolov5) ml@ml-ts:~/yolov5$ pip show dill
    Name: dill
    Version: 0.3.8
    Summary: serialize all of Python
    Home-page: https://github.com/uqfoundation/dill
    Author: Mike McKerns
    Author-email: mmckerns@uqfoundation.org
    License: BSD-3-Clause
    Location: /home/ml/miniconda3/envs/yolov5/lib/python3.9/site-packages
    Requires: 
    Required-by:

    The default YOLOv5 object-detection models convert fine; they raised the same error at first, but worked once the library was installed. The custom model conversion, however, still fails:

    Code: Select all

    (yolov5) ml@ml-ts:~/yolov5$ python export.py --rknpu --weight sdel.pt 
    export: data=data/coco128.yaml, weights=['sdel.pt'], imgsz=[640, 640], batch_size=1, device=cpu, half=False, inplace=False, keras=False, optimize=False, int8=False, dynamic=False, simplify=False, opset=12, verbose=False, workspace=4, nms=False, agnostic_nms=False, topk_per_class=100, topk_all=100, iou_thres=0.45, conf_thres=0.25, include=['onnx'], rknpu=True
    YOLOv5 🚀 v4.0-1657-gd25a075 Python-3.9.12 torch-2.3.1+cu121 CPU
    
    Traceback (most recent call last):
      File "/home/ml/yolov5/export.py", line 723, in <module>
        main(opt)
      File "/home/ml/yolov5/export.py", line 717, in main
        run(**vars(opt))
      File "/home/ml/miniconda3/envs/yolov5/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "/home/ml/yolov5/export.py", line 533, in run
        model = attempt_load(weights, device=device, inplace=True, fuse=True)  # load FP32 model
      File "/home/ml/yolov5/models/experimental.py", line 80, in attempt_load
        ckpt = torch.load(attempt_download(w), map_location='cpu')  # load
      File "/home/ml/miniconda3/envs/yolov5/lib/python3.9/site-packages/torch/serialization.py", line 1025, in load
        return _load(opened_zipfile,
      File "/home/ml/miniconda3/envs/yolov5/lib/python3.9/site-packages/torch/serialization.py", line 1446, in _load
        result = unpickler.load()
      File "/home/ml/miniconda3/envs/yolov5/lib/python3.9/site-packages/torch/serialization.py", line 1439, in find_class
        return super().find_class(mod_name, name)
    ModuleNotFoundError: No module named 'dill'
    
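    For context, this ModuleNotFoundError comes from how pickle resolves classes: a checkpoint stores each pickled object's class by module path, so if the custom weights were saved in an environment where objects passed through dill, loading them later requires dill to be importable. A minimal stdlib sketch of the same failure mode (train_helper and Opt are made-up names standing in for the training-time dependency):

    ```python
    import pickle
    import sys
    import types

    # Create a throwaway module with a class, mimicking a training-time
    # dependency (like dill) whose classes end up inside a checkpoint.
    mod = types.ModuleType("train_helper")
    exec("class Opt:\n    def __init__(self, lr): self.lr = lr", mod.__dict__)
    sys.modules["train_helper"] = mod

    # "Checkpoint" that references train_helper.Opt by module path.
    blob = pickle.dumps(mod.Opt(0.01))

    # Simulate an environment where that package is not installed.
    del sys.modules["train_helper"]

    try:
        pickle.loads(blob)
    except ModuleNotFoundError as e:
        print(e)  # No module named 'train_helper'
    ```

    Installing the missing package into the conversion environment (as shown above with pip) is the direct fix; make sure it is installed in the exact conda env that runs export.py. Alternatively, loading the checkpoint once in an environment that has the package and re-saving it with torch.save removes the dependency from the file.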
    Last edited by techhitesh on 2024-07-25 16:50, edited 2 times in total.
  • Hello, due to differences in development environments and actual needs, it is challenging to pinpoint the exact issue with custom model inference. In the future, we plan to build a universal development environment for custom model inference.

    Currently, a common issue is incorrect configuration of the recognition categories and of the file that describes them, coco_80_labels_list.txt.
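
    The class list handed to the runtime must match the custom model's training classes, in the same order. A minimal sketch of writing such a file in the same one-class-per-line format as coco_80_labels_list.txt (the class names below are placeholders; use the `names` list from your custom dataset's data YAML):

    ```python
    # Placeholder class names; replace with the 'names' entries from your
    # custom dataset's data.yaml, in the exact order used for training.
    classes = ["helmet", "vest", "person"]

    # One class per line, mirroring the coco_80_labels_list.txt format.
    with open("custom_labels_list.txt", "w") as f:
        f.write("\n".join(classes) + "\n")

    # Sanity check: the file round-trips to the same ordered list.
    print(open("custom_labels_list.txt").read().splitlines())
    ```

    The number of lines in this file should equal the number of classes the model was trained on; pointing the inference code at a label file with the wrong count is a common source of crashes with custom models.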