Please use this identifier to cite or link to this item:
http://ir.futminna.edu.ng:8080/jspui/handle/123456789/20050
Title: | PERFORMANCE EVALUATION OF SELECTED DEEP LEARNING ALGORITHMS IN AUTOMATIC WEED DETECTION SYSTEM. |
Authors: | Ashi, John |
Issue Date: | 11-Jun-2023 |
Abstract: | Site-specific weed detection and management in agrarian lands is a crucial approach for crop productivity management and chemical contamination mitigation in precision agriculture. Traditional methods of executing this operation are expensive and labour-intensive, and expose personnel to hazardous chemicals. To create a more sustainable agricultural system, a program for automatically detecting agricultural weeds in mixed farmland using the Faster RCNN Inception v2 model and the YOLOv5s neural network was proposed. With the introduction of Unmanned Aerial Vehicles (UAVs) and advances in Deep Learning techniques in recent years, it has become possible to identify and classify weeds from crops at the desired spatial and temporal resolution. A DJI Phantom 4 UAV was used to collect about 254 image pairs of a mixed-crop farmland. The proposed approach for Faster RCNN involves labelling (annotating) the images before uploading the dataset to Google Colaboratory (Colab), an online Graphics Processing Unit (GPU) environment that runs Python. The dataset was trained at five epoch counts (10,000; 20,000; 100,000; 200,000; and 242,000) to find the point where the model's accuracy flattens out, and the trained model was then evaluated on the testing dataset for automatic identification and classification of weeds. The YOLOv5s network was likewise trained on Colab in Python over 100, 300, 500, 600, 700, and 1,000 epochs. Both neural network algorithms identified and classified five classes: sugarcane, spinach, banana, pepper, and weeds. The overall classification accuracies of the two classifiers differed widely: Faster RCNN exhibited the highest overall accuracies, while notably lower accuracies were observed with YOLOv5s.
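The YOLOv5s training sweep described above can be sketched as follows. This is a minimal illustration assuming the public ultralytics/yolov5 repository's `train.py` entry point; the dataset config name `weeds.yaml`, image size, and batch size are hypothetical placeholders, not details from the thesis. The commands are only assembled here, not executed.

```python
# Epoch counts evaluated in the study for YOLOv5s.
EPOCH_COUNTS = [100, 300, 500, 600, 700, 1000]

def yolov5s_train_cmd(epochs: int) -> list[str]:
    """Build one yolov5 train.py invocation for a given epoch count."""
    return [
        "python", "train.py",
        "--img", "640",             # input resolution (assumed yolov5 default)
        "--batch", "16",            # batch size (assumed)
        "--epochs", str(epochs),
        "--data", "weeds.yaml",     # hypothetical config listing the 5 classes
        "--weights", "yolov5s.pt",  # pretrained small-variant checkpoint
    ]

# Print the full sweep of commands that would be run on Colab.
for n in EPOCH_COUNTS:
    print(" ".join(yolov5s_train_cmd(n)))
```

Each run would produce one trained checkpoint, after which per-class precision and recall can be compared across epoch counts.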
The lowest Faster RCNN accuracies were recorded at 10,000 epochs (52% overall accuracy, 50% weed precision, 8% weed recall), while the highest accuracies, and the saturation point, were reached at 200,000 epochs (98% overall accuracy, 98% weed precision, 99% weed recall). At its minimum of 100 epochs, YOLOv5s achieved 16% overall accuracy, 5% weed precision, and 1% weed recall; it reached its maximum weed precision at 600 epochs, with 78% weed precision, 34% weed recall, and 67% overall accuracy. With YOLOv5s achieving only 16% and 66% overall accuracy, the Faster RCNN Deep Learning model exhibited better classification output, making it the more suitable classifier for automatic weed identification and classification, and it is thus recommended. Further research should compare the Faster RCNN Inception v2 model with other recent, powerful Deep Learning algorithms to strengthen weed detection on small farmlands. Images should also be captured at a flying height below 30 m, and closer for smaller weeds, so that they appear larger in the image. |
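The precision, recall, and overall-accuracy figures quoted above follow the standard per-class definitions. The sketch below shows those formulas; the true/false positive counts used in the example are illustrative placeholders chosen to reproduce the reported weed-class percentages at the Faster RCNN saturation point, not the study's actual confusion-matrix counts.

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of predicted weed boxes that really are weeds."""
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp: int, fn: int) -> float:
    """Fraction of actual weeds that the detector found."""
    return tp / (tp + fn) if (tp + fn) else 0.0

def overall_accuracy(correct: int, total: int) -> float:
    """Fraction of all detections classified correctly, across all 5 classes."""
    return correct / total

# Illustrative counts matching the reported 98% weed precision / 99% recall:
print(round(precision(98, 2), 2))   # 0.98
print(round(recall(99, 1), 2))      # 0.99
print(round(overall_accuracy(98, 100), 2))  # 0.98
```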
URI: | http://repository.futminna.edu.ng:8080/jspui/handle/123456789/20050 |
Appears in Collections: | General Studies Unit |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|---
ASHI JOHN'S THESIS (1).pdf | | 6.31 MB | Adobe PDF | View/Open
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.