So my parents are tired of watering by hand. I offered to help set up some
irrigation before I take off for parts unknown.
I've measured the area they need irrigated, figured out the maximum required inches per week, and converted the resulting acre-inches into gallons.
Where I'm having trouble with the math is figuring out how long the irrigation can run without causing runoff. I realize this is a bit of an exercise in aggravation: it won't be hard to observe runoff once the irrigation is installed and adjust from there, and the soil is such a mixed bag that the answer will probably vary substantially between beds anyway. But now that it has annoyed me, I want to sort out the math even if it's of dubious practical utility.
I found this page with an example of what I need... sort of.
http://extension.psu.edu/plants/vegetable-fruit/news/2013/determining-how-long-to-run-drip-irrigation-systems-for-vegetables
So, according to their example and chart, an available water holding capacity of 0.12 inches of water per inch of soil depth and a flow rate of 0.45 GPM per 100' means a maximum time per application of 110 minutes. They specify a 10-inch-deep root zone and that irrigation happens at 50% soil moisture depletion.
0.45 GPM × 110 minutes = 49.5 gallons for a 100' row 30" wide, i.e. 250 sq ft.
A cubic foot of water covers 12 square feet to a depth of one inch. At 7.48 gallons per cubic foot, 49.5 gallons is about 6.62 cubic feet of water, or ~79.4 square feet at 1" depth. Spread over the whole 250 sq ft row, that's ~0.318" of water.
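To keep myself honest, here's the applied-water side of the calculation as a short Python sketch. The numbers come from the Penn State example; the variable names are my own:

```python
GAL_PER_CUFT = 7.48  # gallons in one cubic foot of water

flow_gpm = 0.45       # drip tape flow rate per 100' of row
runtime_min = 110     # max application time from the chart
row_area_sqft = 250   # 100' row x 30" wide

gallons = flow_gpm * runtime_min        # 49.5 gallons
cubic_feet = gallons / GAL_PER_CUFT     # ~6.62 cu ft
# one cubic foot covers 12 sq ft to a depth of one inch
sqft_at_one_inch = cubic_feet * 12      # ~79.4 sq ft
applied_inches = sqft_at_one_inch / row_area_sqft  # ~0.318"
print(round(applied_inches, 3))
```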
Now the other direction: the holding capacity given for the example soil (0.12 inches of water per inch of soil depth), multiplied by the stated 10" root zone depth, multiplied by the area of the row, multiplied by the 50% depletion fraction:
0.12 × 10 × 250 × 0.5 = 150 square-foot-inches of water, which spread over the entire 250 sq ft row works out to 0.6" of water.
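And the holding-capacity side as the same kind of sketch (again, numbers from the example, names mine). Note that multiplying by the row area and then dividing by it again is a no-op; the depth works out to 0.6" either way:

```python
awc = 0.12           # available water capacity, in of water per in of soil
root_depth_in = 10   # managed root zone depth, inches
depletion = 0.5      # irrigate when 50% of available water is used
row_area_sqft = 250  # 100' row x 30" wide

needed_inches = awc * root_depth_in * depletion     # 0.6"
# multiplying by area gives a volume in square-foot-inches...
volume_sqft_inches = needed_inches * row_area_sqft  # 150
# ...and dividing by the same area just returns the 0.6" depth
print(needed_inches, volume_sqft_inches)
```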
Obviously something is wrong here, since these two numbers should match, no? Does anyone see where I've gone wrong, presuming it is I who have gone wrong? Or have a better example?