Example 1. Judging by the red “error bars”, this analysis suggests that someone is very confident that yield changes dramatically when elevation changes by 300 ft.
In the last few years, a lot of time and money have been spent publicizing the benefits of cross-farm comparisons and analyses. “Benchmarking” has claimed its spot among ag’s top buzzwords, and while a lot of benefit can come with understanding how your operation stacks up against peer farms, conclusions need to be drawn accurately and carefully. Poor benchmarking can stem from many problems, ranging from data-quality issues (such as aggregating incomplete or mislabeled data) to inaccurate analysis. Many readers will probably be familiar with “as planted” files that list the wrong hybrid, but there are also many examples of incorrect analysis that are harder to catch.
Issues with analysis methodology and visualization are often the easiest to spot. Here are some more examples we’ve seen in the market – we’ve recreated the original data to preserve the anonymity of the companies involved:
Example 2. This suggests that yield goes up at 130k and 170k seeds/acre, but that somehow 160k seeds/acre is bad for your field.
Example 3. This plot suggests that a) yields go down after you reach a soil productivity index of 0.3 and b) some people are farming ground that has a negative productivity index – something that doesn’t even exist in reality!
Most farmers looking at the charts above would recognize these issues and avoid making a decision based on the visualization alone. However, it is often difficult or even impossible to judge the methodology or the data quality that went into an analysis. For example, one of the most common analyses published is a ranking of yield by variety, which typically shows up as a table like this:
In this example, the farmer typically has no way to tell if the data or analysis used to make the table was sound. They thus need to trust that the data they are using is high quality and has been analyzed properly…a trust that is sometimes misplaced. Here are some practical takeaways to keep in mind when you are looking at benchmarking data:
1. Find companies that you trust. Finding bad data is like finding a mouse: if you spot it once, there is probably a lot more that you haven’t noticed.
2. Make sure the results you see pass the “common sense” test: do the results make logical sense? Hint: several of the conclusions you could draw from the examples shown above don’t.
3. Understand the sample set and make sure you are looking at data that is representative and relevant for your particular operation. “Yield by variety” that includes irrigated land in Nebraska, for example, isn’t that helpful if you are a dryland farmer in Kansas.
4. Remember that the value of benchmarking can be positive or negative: making a poor decision from questionable analysis can be a lot more expensive than what you paid to subscribe to the benchmarking service.
5. Correlation is not causation. Here is a simple example: people plant slower on hills. If yield is lower on steep slopes, an analysis of yield vs. planting speed will make it look like planting slower is bad for yield. Adding more farms to the sample won’t fix the problem, either: a systematic confounder like slope doesn’t average out, so don’t fall for the line “more farms average out potential errors”.
6. Make sure you understand why the company is providing you benchmarking: is their business based on providing accurate data, or do they have another way of making money? Good benchmarking takes a lot of work and focus – it shouldn’t be a company’s afterthought.
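The confounding described in point 5 can be sketched with simulated data. All numbers below are hypothetical (the planting speeds, yields, and slope effects are invented for illustration); the point is only that a hidden variable can produce a strong apparent correlation that disappears once you control for it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hidden confounder: slope of each field pass (0 = flat, 1 = steep).
slope = rng.uniform(0, 1, n)

# Operators slow down on steep ground (mph), and steep ground yields less (bu/ac).
# Planting speed has NO causal effect on yield in this simulation.
speed = 5.5 - 2.0 * slope + rng.normal(0, 0.3, n)
yield_bu = 200.0 - 40.0 * slope + rng.normal(0, 5.0, n)

# Naive benchmarking analysis: yield vs. planting speed across all passes.
naive_corr = np.corrcoef(speed, yield_bu)[0, 1]

# Controlling for slope: regress both variables on slope and correlate
# the residuals. The apparent "speed effect" vanishes.
speed_resid = speed - np.polyval(np.polyfit(slope, speed, 1), slope)
yield_resid = yield_bu - np.polyval(np.polyfit(slope, yield_bu, 1), slope)
partial_corr = np.corrcoef(speed_resid, yield_resid)[0, 1]

print(f"naive correlation:       {naive_corr:.2f}")    # strongly positive
print(f"controlling for slope:   {partial_corr:.2f}")  # near zero
```

Note that sampling more passes (raising `n`) only makes the naive correlation more precise, not more correct, which is why “more farms average out potential errors” doesn’t hold for a systematic confounder.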
Despite the preceding examples, we think that benchmarking is here to stay and that it can provide a lot of real value. There are a number of well-analyzed, useful benchmarking programs out there (you can read about some of our work at Granular here and here). However, the quality of data in the marketplace currently varies tremendously, and benchmarking as a whole runs the risk of ending up with a bad reputation. Before you make important business decisions based on a benchmarking or cross-farm analytics program, look closely at where the data comes from and how it was analyzed – not just where you show up relative to others!