# linkta-yolov8-train

**Repository Path**: flamebox/linkta-yolov8-train

## Basic Information

- **Project Name**: linkta-yolov8-train
- **Description**: No description available
- **Primary Language**: Python
- **License**: MulanPSL-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-08-09
- **Last Updated**: 2024-08-09

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

## 1. Build the Environment

Make sure you are logged in as root, then run:

```sh
chmod +x ./build_env.sh && ./build_env.sh
```

If the network is too slow while installing torch, stop the automated build script and manually download these two files:

```
https://download.pytorch.org/whl/cu118/torch-2.1.2%2Bcu118-cp310-cp310-linux_x86_64.whl
https://download.pytorch.org/whl/cu118/torchvision-0.16.2%2Bcu118-cp310-cp310-linux_x86_64.whl
```

Then re-enter the virtual environment and run the following commands in order to finish the installation by hand:

```sh
source .yolov8_venv/bin/activate
pip install torch-2.1.2+cu118-cp310-cp310-linux_x86_64.whl
pip install torchvision-0.16.2+cu118-cp310-cp310-linux_x86_64.whl
pip install openvino==2024.0.0
pip install nncf
pip install ultralytics==8.0.170
pip install opencv-python-headless
cp files/Arial.ttf /root/.config/Ultralytics/
```

## 2. Train and Export the Model

- Enter the virtual environment from a terminal:

```sh
source .yolov8_venv/bin/activate
```

- Edit `train.yaml` to set the training dataset path and class names. The dataset path must be an **absolute path**:

```yaml
# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]
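# Note (illustrative, not part of the original yaml): Ultralytics locates
# labels by replacing 'images' with 'labels' in each split path, so a
# typical layout under the dataset root looks like:
#   /train_data/pig/train/images  +  /train_data/pig/train/labels
#   /train_data/pig/valid/images  +  /train_data/pig/valid/labels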
path: /train_data/pig  # dataset root dir (must be absolute)
train: train/images  # train images (relative to 'path')
val: valid/images  # val images (relative to 'path')
test:  # test images (optional)

# Classes
names:
  0: pig
```

- Start training:

```sh
# Avoid errors in multi-GPU training
export MKL_SERVICE_FORCE_INTEL=1
yolo detect train data=train.yaml model=yolov8s.yaml epochs=300 batch=24 imgsz=640 device=0
```

- Export the model. Copy the trained `best.pt` into `my_models`, then run:

```sh
cd my_models
python yolov8_openvino_int8.py best.pt ../train.yaml
```

- The quantization script converts the model, compares accuracy scores, and benchmarks inference speed. The resulting models are saved in `my_models/xxx_openvino_model`; `xxx_quantized.xml` and `xxx_quantized.bin` are the models to hand off for 天露 deployment.
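Since a relative or missing `path` in `train.yaml` is an easy mistake (the dataset root must be absolute, as noted above), a quick sanity check before launching training can save a failed run. The sketch below is illustrative, not part of this repo: `check_dataset_yaml` is a hypothetical helper, and it deliberately uses plain text parsing of simple `key: value` lines rather than a full YAML parser.

```python
import os

def check_dataset_yaml(yaml_path):
    """Lightweight sanity checks on a YOLO dataset yaml.

    Returns a list of problem descriptions (empty if everything looks OK).
    Only handles flat 'key: value' lines, which is enough for this file.
    """
    problems = []
    keys = {}
    with open(yaml_path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()  # drop trailing comments
            if ":" in line:
                key, _, value = line.partition(":")
                keys[key.strip()] = value.strip()

    # The dataset root must be an absolute path (per the instructions above).
    if not os.path.isabs(keys.get("path", "")):
        problems.append("'path' must be an absolute path")

    # Training needs at least a train and a val split.
    for split in ("train", "val"):
        if not keys.get(split):
            problems.append(f"missing '{split}' split")

    return problems
```

Running it against `train.yaml` and printing the returned list gives an immediate hint about what to fix before calling `yolo detect train`.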