Agriculture Robots using Deep Learning
Prashanth M V1, Nida Susan A K2, Sanjana M S3
1Prof. Prashanth M V*, Dept. of Information Science and Engineering, Vidyavardhaka College of Engineering, VVCE, India.
2Nida Susan A K, Dept. of Information Science and Engineering, Vidyavardhaka College of Engineering, VVCE, India.
3Sanjana M S, Dept. of Information Science and Engineering, Vidyavardhaka College of Engineering, VVCE, India.
Manuscript received on March 16, 2020. | Revised Manuscript received on March 24, 2020. | Manuscript published on March 30, 2020. | PP: 2266-2271 | Volume-8 Issue-6, March 2020. | Retrieval Number: F7823038620/2020©BEIESP | DOI: 10.35940/ijrte.F7823.038620
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Agriculture is vital to the continuation of mankind and remains a driving factor for many of the world's economies, especially in developing and emerging countries. Demand for food and food crops is rising with the globe's expanding population, while environmental challenges and cost pressures constrain production. Agricultural researchers often adopt software systems without an adequate analysis of the ideas and mechanisms behind a technique. Applying intelligence is seen as the greatest challenge in nurturing productive capacity and achieving accurate performance. In this paper we aim to understand the key techniques for using robots in agriculture. Deep learning has been extensively studied and applied in many fields in recent years, including agriculture. Robot systems provide a stable, cost-effective, flexible and modular platform for product development and deliver predictable results.
Keywords: Deep Learning, Classification, Neural Networks, Pattern Recognition, Multi-Robot Systems, Autonomous Agricultural Robots, Identification of Anomalies
Scope of the Article: Deep Learning.
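The abstract and keywords refer to deep-learning-based classification with neural networks. The paper's actual models are not reproduced on this page; as a minimal, hypothetical sketch of the idea, the toy example below trains a small two-layer neural network (plain NumPy, not the authors' implementation) to separate synthetic two-dimensional feature vectors standing in for "healthy" versus "anomalous" crop samples.

```python
import numpy as np

# Toy sketch (not from the paper): a tiny two-layer neural network
# classifying synthetic "healthy vs. anomalous crop" feature vectors.
rng = np.random.default_rng(0)

# Synthetic 2-D features: class 0 clustered near (0,0), class 1 near (3,3).
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two-layer network: 2 inputs -> 8 hidden units -> 1 output probability.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(500):
    h = np.tanh(X @ W1 + b1)      # hidden-layer activations
    p = sigmoid(h @ W2 + b2)      # predicted probability of class 1
    # Backpropagate binary cross-entropy gradients through both layers.
    dz2 = (p - y) / len(X)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = (dz2 @ W2.T) * (1 - h ** 2)   # tanh derivative is 1 - tanh^2
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```

With well-separated synthetic clusters the network fits the data quickly; real agricultural pipelines of the kind the abstract describes would instead use convolutional networks on field imagery and far larger labelled datasets.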