Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning—A Case Study of Bale Detection. Zhao, W., Yamada, W., Li, T., Digman, M., & Runge, T. Remote Sensing, 13(1):23, Multidisciplinary Digital Publishing Institute, December 2021. doi: 10.3390/RS13010023

Abstract: In recent years, precision agriculture has been researched as a promising means to increase crop production with fewer inputs and to meet the growing demand for agricultural products. Computer vision-based crop detection with unmanned aerial vehicle (UAV)-acquired images is a critical tool for precision agriculture. However, object detection using deep learning algorithms relies on a significant amount of manually prelabeled training data as ground truth. Detecting field objects such as bales is especially difficult because of (1) long-period image acquisition under different illumination conditions and seasons; (2) limited existing prelabeled data; and (3) few pretrained models and little prior research to serve as references. This work increases bale detection accuracy under limited data collection and labeling by building an innovative algorithm pipeline. First, an object detection model is trained using 243 images captured under good illumination conditions in fall from crop lands. Next, domain adaptation (DA), a kind of transfer learning, is applied to synthesize training data under diverse environmental conditions with automatic labels. Finally, the object detection model is optimized with the synthesized datasets. The case study shows that the proposed method improves bale detection performance, raising the average recall, mean average precision (mAP), and F measure (F1 score) from 0.59, 0.70, and 0.70 (object detection alone) to 0.93, 0.94, and 0.89 (object detection + DA), respectively. This approach could easily be scaled to many other crop field objects and will significantly contribute to precision agriculture.
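The reported gains are expressed as detection-level recall, mAP, and F1. As a rough illustration of how such metrics are typically computed, below is a minimal Python sketch; it is not taken from the paper and assumes standard greedy IoU matching at a 0.5 threshold with hypothetical box coordinates.

def iou(box_a, box_b):
    # Intersection-over-union of two [x1, y1, x2, y2] boxes.
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def detection_metrics(pred_boxes, gt_boxes, iou_thresh=0.5):
    # Greedily match each prediction to the best unmatched ground-truth box;
    # a match counts as a true positive when IoU >= iou_thresh.
    matched, tp = set(), 0
    for p in pred_boxes:
        best_j, best_iou = -1, 0.0
        for j, g in enumerate(gt_boxes):
            score = iou(p, g)
            if j not in matched and score > best_iou:
                best_j, best_iou = j, score
        if best_iou >= iou_thresh:
            tp += 1
            matched.add(best_j)
    fp = len(pred_boxes) - tp   # unmatched predictions
    fn = len(gt_boxes) - tp     # missed ground-truth boxes
    precision = tp / max(tp + fp, 1)
    recall = tp / max(tp + fn, 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-9)
    return precision, recall, f1

# Hypothetical example: two predicted boxes, one ground-truth bale.
preds = [[10, 10, 50, 50], [60, 60, 90, 90]]
truth = [[12, 11, 48, 52]]
print(detection_metrics(preds, truth))  # -> (0.5, 1.0, 0.666...)

The same matching logic, applied per image and averaged over a test set, yields the kind of recall and F1 figures quoted above; mAP additionally integrates precision over confidence thresholds.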
@article{zhao2021augmenting,
title = {Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning—A Case Study of Bale Detection},
type = {article},
year = {2021},
keywords = {adverse weather conditions,computer vision,domain adaptation,generative adversarial network (GAN),illumination change},
pages = {23},
volume = {13},
websites = {https://www.mdpi.com/2072-4292/13/1/23/htm,https://www.mdpi.com/2072-4292/13/1/23},
month = {12},
publisher = {Multidisciplinary Digital Publishing Institute},
day = {23},
accessed = {2023-01-06},
abstract = {In recent years, precision agriculture has been researched as a promising means to increase crop production with fewer inputs and to meet the growing demand for agricultural products. Computer vision-based crop detection with unmanned aerial vehicle (UAV)-acquired images is a critical tool for precision agriculture. However, object detection using deep learning algorithms relies on a significant amount of manually prelabeled training data as ground truth. Detecting field objects such as bales is especially difficult because of (1) long-period image acquisition under different illumination conditions and seasons; (2) limited existing prelabeled data; and (3) few pretrained models and little prior research to serve as references. This work increases bale detection accuracy under limited data collection and labeling by building an innovative algorithm pipeline. First, an object detection model is trained using 243 images captured under good illumination conditions in fall from crop lands. Next, domain adaptation (DA), a kind of transfer learning, is applied to synthesize training data under diverse environmental conditions with automatic labels. Finally, the object detection model is optimized with the synthesized datasets. The case study shows that the proposed method improves bale detection performance, raising the average recall, mean average precision (mAP), and F measure (F1 score) from 0.59, 0.70, and 0.70 (object detection alone) to 0.93, 0.94, and 0.89 (object detection + DA), respectively. This approach could easily be scaled to many other crop field objects and will significantly contribute to precision agriculture.},
bibtype = {article},
author = {Zhao, Wei and Yamada, William and Li, Tianxin and Digman, Matthew and Runge, Troy},
doi = {10.3390/RS13010023},
journal = {Remote Sensing},
number = {1}
}
{"_id":"ep7JWAc3WZxjoRWnr","bibbaseid":"zhao-yamada-li-digman-runge-augmentingcropdetectionforprecisionagriculturewithdeepvisualtransferlearningacasestudyofbaledetection-2021","author_short":["Zhao, W.","Yamada, W.","Li, T.","Digman, M.","Runge, T."],"bibdata":{"title":"Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning—A Case Study of Bale Detection","type":"article","year":"2021","keywords":"adverse weather conditions,computer vision,domain adaptation,generative adversarial network (GAN),illimitation change","pages":"23","volume":"13","websites":"https://www.mdpi.com/2072-4292/13/1/23/htm,https://www.mdpi.com/2072-4292/13/1/23","month":"12","publisher":"Multidisciplinary Digital Publishing Institute","day":"23","id":"8c8c347f-efe8-3599-8f3d-5da0c5ac4dfc","created":"2023-01-06T11:08:24.474Z","accessed":"2023-01-06","file_attached":"true","profile_id":"235249c2-3ed4-314a-b309-b1ea0330f5d9","group_id":"5ec9cc91-a5d6-3de5-82f3-3ef3d98a89c1","last_modified":"2023-01-13T07:12:08.385Z","read":false,"starred":false,"authored":false,"confirmed":false,"hidden":false,"folder_uuids":"bd3c6f2e-3514-47cf-bc42-12db8b9abe45","private_publication":false,"abstract":"In recent years, precision agriculture has been researched to increase crop production with less inputs, as a promising means to meet the growing demand of agriculture products. Computer vision-based crop detection with unmanned aerial vehicle (UAV)-acquired images is a critical tool for precision agriculture. However, object detection using deep learning algorithms rely on a significant amount of manually prelabeled training datasets as ground truths. Field object detection, such as bales, is especially difficult because of (1) long-period image acquisitions under different illumination conditions and seasons; (2) limited existing prelabeled data; and (3) few pretrained models and research as references. This work increases the bale detection accuracy based on limited data collection and labeling, by building an innovative algorithms pipeline. First, an object detection model is trained using 243 images captured with good illimitation conditions in fall from the crop lands. In addition, domain adaptation (DA), a kind of transfer learning, is applied for synthesizing the training data under diverse environmental conditions with automatic labels. Finally, the object detection model is optimized with the synthesized datasets. The case study shows the proposed method improves the bale detecting performance, including the recall, mean average precision (mAP), and F measure (F1 score), from averages of 0.59, 0.7, and 0.7 (the object detection) to averages of 0.93, 0.94, and 0.89 (the object detection + DA), respectively. 
This approach could be easily scaled to many other crop field objects and will significantly contribute to precision agriculture.","bibtype":"article","author":"Zhao, Wei and Yamada, William and Li, Tianxin and Digman, Matthew and Runge, Troy","doi":"10.3390/RS13010023","journal":"Remote Sensing","number":"1","bibtex":"@article{\n title = {Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning—A Case Study of Bale Detection},\n type = {article},\n year = {2021},\n keywords = {adverse weather conditions,computer vision,domain adaptation,generative adversarial network (GAN),illimitation change},\n pages = {23},\n volume = {13},\n websites = {https://www.mdpi.com/2072-4292/13/1/23/htm,https://www.mdpi.com/2072-4292/13/1/23},\n month = {12},\n publisher = {Multidisciplinary Digital Publishing Institute},\n day = {23},\n id = {8c8c347f-efe8-3599-8f3d-5da0c5ac4dfc},\n created = {2023-01-06T11:08:24.474Z},\n accessed = {2023-01-06},\n file_attached = {true},\n profile_id = {235249c2-3ed4-314a-b309-b1ea0330f5d9},\n group_id = {5ec9cc91-a5d6-3de5-82f3-3ef3d98a89c1},\n last_modified = {2023-01-13T07:12:08.385Z},\n read = {false},\n starred = {false},\n authored = {false},\n confirmed = {false},\n hidden = {false},\n folder_uuids = {bd3c6f2e-3514-47cf-bc42-12db8b9abe45},\n private_publication = {false},\n abstract = {In recent years, precision agriculture has been researched to increase crop production with less inputs, as a promising means to meet the growing demand of agriculture products. Computer vision-based crop detection with unmanned aerial vehicle (UAV)-acquired images is a critical tool for precision agriculture. However, object detection using deep learning algorithms rely on a significant amount of manually prelabeled training datasets as ground truths. Field object detection, such as bales, is especially difficult because of (1) long-period image acquisitions under different illumination conditions and seasons; (2) limited existing prelabeled data; and (3) few pretrained models and research as references. This work increases the bale detection accuracy based on limited data collection and labeling, by building an innovative algorithms pipeline. First, an object detection model is trained using 243 images captured with good illimitation conditions in fall from the crop lands. In addition, domain adaptation (DA), a kind of transfer learning, is applied for synthesizing the training data under diverse environmental conditions with automatic labels. Finally, the object detection model is optimized with the synthesized datasets. The case study shows the proposed method improves the bale detecting performance, including the recall, mean average precision (mAP), and F measure (F1 score), from averages of 0.59, 0.7, and 0.7 (the object detection) to averages of 0.93, 0.94, and 0.89 (the object detection + DA), respectively. 
This approach could be easily scaled to many other crop field objects and will significantly contribute to precision agriculture.},\n bibtype = {article},\n author = {Zhao, Wei and Yamada, William and Li, Tianxin and Digman, Matthew and Runge, Troy},\n doi = {10.3390/RS13010023},\n journal = {Remote Sensing},\n number = {1}\n}","author_short":["Zhao, W.","Yamada, W.","Li, T.","Digman, M.","Runge, T."],"urls":{"Paper":"https://bibbase.org/service/mendeley/bfbbf840-4c42-3914-a463-19024f50b30c/file/30c15674-9973-2667-5df2-0f0dccd0444e/full_text.pdf.pdf","Website":"https://www.mdpi.com/2072-4292/13/1/23/htm,https://www.mdpi.com/2072-4292/13/1/23"},"biburl":"https://bibbase.org/service/mendeley/bfbbf840-4c42-3914-a463-19024f50b30c","bibbaseid":"zhao-yamada-li-digman-runge-augmentingcropdetectionforprecisionagriculturewithdeepvisualtransferlearningacasestudyofbaledetection-2021","role":"author","keyword":["adverse weather conditions","computer vision","domain adaptation","generative adversarial network (GAN)","illimitation change"],"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"article","biburl":"https://bibbase.org/service/mendeley/bfbbf840-4c42-3914-a463-19024f50b30c","dataSources":["2252seNhipfTmjEBQ"],"keywords":["adverse weather conditions","computer vision","domain adaptation","generative adversarial network (gan)","illimitation change"],"search_terms":["augmenting","crop","detection","precision","agriculture","deep","visual","transfer","learning","case","study","bale","detection","zhao","yamada","li","digman","runge"],"title":"Augmenting Crop Detection for Precision Agriculture with Deep Visual Transfer Learning—A Case Study of Bale Detection","year":2021}