
Wearable Technology Tools

07:24AM Jan 03, 2015

Going hands-free on kernel counts and image recognition

A farmer walks into a field, grabs an ear of corn and takes a picture with Google Glass. Within seconds he knows the precise kernel count—fast, seamless and hands-free. 

IntelliScout, an app from Farmhouse Networks, the new agricultural arm of Basecamp Networks, works in tandem with Google Glass and serves as a platform that can be built out over time.

In 2012, Craig Ganssle, CEO of Basecamp Networks, was selected for the Google Glass Explorer Program. When he got his first pair, Ganssle believed Glass would be most beneficial in a hands-free work environment. “The hands-free functionality sparked interest in farming, and I could see great advantages for Glass in multiple aspects of farming,” he says. “One that really stood out was crop scouting.”

The IntelliScout application, working in conjunction with Glass, connects via Wi-Fi or a smartphone and records pictures, video and dictation, all hands-free. Kernel counting, for example, is a quick process. “Take a picture in the display, and it goes into the Cloud, goes through an algorithm and comes back in less than two seconds to indicate how many kernels are on that ear of corn,” Ganssle describes.
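The article does not describe IntelliScout's actual counting algorithm, but the round trip Ganssle describes implies a server-side routine that finds kernel-like regions in a photo. As a rough illustration only, with a toy binary image standing in for a real photo, distinct regions can be counted with a simple connected-components pass:

```python
# Illustrative sketch only: the article does not describe IntelliScout's
# real algorithm. This counts connected regions of 1s ("kernel pixels")
# in a toy binary image, a minimal stand-in for a kernel counter.

def count_kernels(grid):
    """Count 4-connected blobs of 1s in a 2-D list of 0s and 1s."""
    rows, cols = len(grid), len(grid[0])
    seen = set()

    def flood(r, c):
        # Iterative flood fill marks every pixel in one blob as seen.
        stack = [(r, c)]
        while stack:
            y, x = stack.pop()
            if (0 <= y < rows and 0 <= x < cols
                    and (y, x) not in seen and grid[y][x] == 1):
                seen.add((y, x))
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                blobs += 1
                flood(r, c)
    return blobs

toy_image = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
kernel_count = count_kernels(toy_image)  # three separate blobs
```

A production counter would work on real camera images with segmentation and noise handling; this sketch only shows the region-counting idea.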

A dashboard console categorizes the data from IntelliScout field reports. Each picture is automatically geotagged at capture, and the data populates the dashboard, mapping coordinates inside the field. When a field report is sent to the Cloud, it also pulls in weather data: temperature, wind speed, humidity, precipitation, soil moisture and more.
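To make the flow concrete, a field report of the kind described might pair a geotagged photo with weather data attached server-side. This is a hypothetical sketch; the field names and the `attach_weather` helper are illustrative, not IntelliScout's real schema or API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical field-report shape: photo geotagged at capture time,
# weather filled in when the report reaches the Cloud. All names here
# are assumptions for illustration.

@dataclass
class FieldReport:
    photo_id: str
    latitude: float        # geotagged automatically when the picture is taken
    longitude: float
    taken_at: str          # ISO 8601 capture timestamp
    weather: dict = field(default_factory=dict)  # populated server-side

def attach_weather(report, station_reading):
    """Stand-in for the server-side step that pulls weather data
    (temperature, wind speed, humidity, precipitation, soil moisture)
    into the report, keeping only the expected fields."""
    expected = {"temp_f", "wind_mph", "humidity_pct",
                "precip_in", "soil_moisture_pct"}
    report.weather = {k: v for k, v in station_reading.items() if k in expected}
    return report

report = FieldReport("ear-0042", 39.19, -96.58,
                     datetime(2015, 1, 3, 7, 24, tzinfo=timezone.utc).isoformat())
attach_weather(report, {"temp_f": 28.4, "wind_mph": 12.0,
                        "humidity_pct": 61, "precip_in": 0.0,
                        "soil_moisture_pct": 22, "station_id": "ignored"})
```

Structuring reports this way is what lets the dashboard map each photo's coordinates and cross-reference conditions at the moment of capture.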

Map data can be layered into the platform for even more precise location. “Field, plot and row—all of those precise locations are simple every time you take a field report,” Ganssle says.

Currently, Ganssle is working on image recognition and phenotyping to enable IntelliScout to identify insects or plant disease by matching photos in a database. “We’re connecting with some large agriculture companies to get access to hundreds of thousands of images taken over decades of archiving,” Ganssle notes. 
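The matching idea Ganssle describes, comparing a new photo against a large labeled archive, can be sketched as a nearest-neighbor lookup. This is purely illustrative: a real system would use learned image features rather than the toy 2-component vectors below, and the labels and names here are assumptions:

```python
# Minimal nearest-neighbor sketch of photo-to-database matching.
# Each "image" is a toy feature vector; real image recognition would
# extract features from the photo itself.

def identify(sample, reference_db):
    """Return the label whose reference feature vector is nearest to
    the sample, by squared Euclidean distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_db, key=lambda label: sq_dist(sample, reference_db[label]))

# Hypothetical reference entries built from an archive of labeled photos.
reference_db = {
    "corn earworm": (0.9, 0.1),
    "gray leaf spot": (0.2, 0.8),
}
match = identify((0.85, 0.15), reference_db)  # nearest reference wins
```

The value of the decades-deep image archives Ganssle mentions is exactly this: the larger and better-labeled the reference database, the more reliably a new field photo lands near the right entry.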

Ganssle has partnered for further research with Kansas State University and Ignacio Ciampitti, assistant professor in crop production and cropping systems. “IntelliScout could have particularly great potential for estimating yield and getting grain numbers,” Ciampitti says. “At the moment, we send people into the field to get yield estimations before harvest, and counting rows is a tedious task. An automatic count will save major time.”

IntelliScout presently works with Google Glass but will also work with other wearable devices as they come to market. 

“When farmers or consultants go into a field, they want to try several things at once, and wearable technology could let them take multiple tools into the field,” Ciampitti notes.

Wearable technologies offer a range of potential applications for crop production. In addition to kernel counting, Ganssle is developing cotton node counting and grape counting functions for IntelliScout. Ganssle wants to focus on key facets of individual crops for scouting and identification.  

“We don’t want to dictate what IntelliScout will do. We want to collaborate with farmers, consultants and agriculture companies and find out how we can have the greatest impact on farming by learning from them.”