Identification of the appropriate use rate is a critical first step in the herbicide development process because use rates affect product utility, market value, and the various risk assessments conducted during the regulatory review that precedes registration. For a given herbicide to be commercially successful, it must provide consistent and sustained efficacy at a use rate structure that meets customer requirements across a wide range of conditions. Recently, recommendations have been made that advocate herbicide use rates below those specified on registered product labels. Such advice tends to be based on field work and predictive models designed to identify specific conditions under which reduced herbicide rates are theoretically optimal, as dictated by threshold values and assumed levels of commercially acceptable weed control. Unfortunately, many other studies indicate that reduced herbicide rates introduce considerable variability in efficacy and economic risk. Consequently, reduced-rate theories and the related predictive models are often of limited practical value to growers. Aside from inconsistent performance, weed control strategies based on reduced herbicide rates are not a solution for preventing or even delaying target-site resistance. In fact, prolonged use of sublethal rates may select for metabolic resistance and create future weed management challenges by replenishing the weed seed bank. Much development time and many resources are invested before product commercialization to ensure that product labels are easily understood and provide value to growers. In this regard, every effort is made to identify the lowest effective use rate that will consistently control target weeds and optimize economic outcomes for both the grower and the manufacturer.
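To make the sensitivity of such threshold-based rate predictions concrete, the minimal sketch below inverts a Hill-type (log-logistic) dose-response curve, a standard functional form in weed dose-response work, to back-calculate the rate predicted to deliver an assumed level of acceptable control. This is not the method of any particular published model; the ED50, slope, target control levels, and labeled rate are all hypothetical values chosen only for illustration.

```python
# Illustrative sketch only: a simplified threshold-based rate calculation of the
# kind used by reduced-rate predictive models. The Hill-type (log-logistic)
# dose-response form is standard, but every parameter value below
# (ED50, slope, target control, labeled rate) is hypothetical.

def rate_for_target_control(target_pct, ed50, slope):
    """Invert control(x) = 100 / (1 + (ed50 / x)**slope) to find the rate x
    predicted to give target_pct weed control."""
    if not 0 < target_pct < 100:
        raise ValueError("target control must be between 0 and 100%")
    return ed50 * (target_pct / (100.0 - target_pct)) ** (1.0 / slope)

# Hypothetical parameters: ED50 = 25 g ai/ha, labeled rate = 100 g ai/ha.
labeled_rate = 100.0
for slope in (1.5, 2.5):          # slope varies with species, weed size, environment
    for target in (90.0, 95.0):   # assumed "commercially acceptable" control level
        rate = rate_for_target_control(target, ed50=25.0, slope=slope)
        print(f"slope={slope}, target={target:.0f}% control -> "
              f"{rate:.0f} g ai/ha ({100 * rate / labeled_rate:.0f}% of labeled rate)")
```

Under these assumed numbers, shifting the slope from 2.5 to 1.5, or the acceptable control level from 90% to 95%, moves the "optimized" rate from roughly 60% of the labeled rate to at or above the full labeled rate. That sensitivity to parameters that change with species, weed size, and environment is the kind of behavior underlying the variability and economic risk noted above.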