Over the last century, the United States has witnessed three approaches to achieving better regulatory outcomes: the removal of “economic” regulations in certain sectors; regulatory impact analysis (RIA) of new “social” regulations; and retrospective analysis of existing regulations. This article reviews the rationale for each approach, the results to date, and the remaining challenges. It finds that both institutional and technical factors influence the success of reform efforts.
This study analyzed the impact of environmental regulation, specifically the “2+26” regional strategy for air quality improvement, on corporate research and development (R&D) investment in China. We developed a theoretical model based on the argument that R&D investment rises with regulation intensity. Using 2010–2019 data from China's listed companies located in the Beijing-Tianjin-Hebei region and its surrounding areas, we treated the “2+26” policy as a quasi-natural experiment and adopted a difference-in-differences approach to explore its effect on firm R&D input. A positive association was observed between firm R&D intensity and the “2+26” strategy's implementation in major polluting industries. Our results provide in-depth insights into the “2+26” strategy's economic consequences, which are potentially of interest to both scholars and policymakers.
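The difference-in-differences logic used in the abstract above can be sketched as follows. This is a minimal illustration with hypothetical numbers, not the paper's actual specification or estimates; the group means and variable names are invented for exposition.

```python
# Minimal difference-in-differences (DiD) illustration.
# All numbers are hypothetical, for exposition only.

# Mean R&D intensity (R&D spending / revenue, in %) by group and period.
means = {
    ("treated", "pre"): 2.0,   # firms in "2+26" regions, before the policy
    ("treated", "post"): 2.9,  # same firms, after the policy
    ("control", "pre"): 1.8,   # comparison firms, before the policy
    ("control", "post"): 2.1,  # comparison firms, after the policy
}

def did_estimate(m):
    """DiD = (treated post - treated pre) - (control post - control pre).
    The control group's change nets out the common time trend."""
    treated_change = m[("treated", "post")] - m[("treated", "pre")]
    control_change = m[("control", "post")] - m[("control", "pre")]
    return treated_change - control_change

effect = did_estimate(means)
print(f"Estimated policy effect on R&D intensity: {effect:.2f} pp")
```

In practice the estimate comes from a panel regression with firm and year fixed effects and an interaction of treatment-group and post-period indicators, but the identifying comparison is exactly the double difference computed here.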
Tree care license requirements are expected to improve service quality and provider competencies. The study elicits the licensing fee Georgia firms would pay using survey data from 153 tree care firms. An empirical model using the inverse hyperbolic sine transformation identifies the firm characteristics that influence the size of the proposed fee. Results show that respondents who propose higher annual licensing fees are on average younger, more educated, male, and likely to agree that mandatory licensing is necessary to establish professionalism in the tree care industry. Also, those often engaged in tree trimming services and firms with higher annual revenues are willing to pay higher licensing fees.
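The inverse hyperbolic sine (IHS) transformation mentioned above is a standard device for log-like transformations of outcome variables that can be zero, such as a proposed fee of $0. A minimal sketch (the use of `math.asinh` as a cross-check is my addition, not from the paper):

```python
import math

def ihs(y):
    """Inverse hyperbolic sine: ln(y + sqrt(y^2 + 1)).
    Behaves like ln(2y) for large y but, unlike ln(y),
    is defined at y = 0 -- useful for fee data with zeros."""
    return math.log(y + math.sqrt(y * y + 1.0))

print(ihs(0.0))    # defined where log() is not
print(ihs(100.0))  # approximately ln(200) for large y
```

Python's standard library exposes the same function directly as `math.asinh`, so in applied work the explicit formula is rarely needed; it is shown here to make the zero-handling property transparent.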
In this article, we reconceptualize, using an extended discrete and dynamic Ostrom's classification, the specific intellectual property (IP) regimes that support geographical indications (GIs) as ‘knowledge commons’, i.e. a set of shared collective knowledge resources constituting a complex ecosystem created and shared by a group of people that remains subject to social dilemmas. Geographical names are usually considered part of the public domain. However, under certain circumstances, geographical names have also been appropriated through trademark registration. Our analysis suggests that IP laws that support GIs first emerged in Europe and spread worldwide as a response to the threat of undue usurpation or private confiscation through trademark registration. We thus emphasize the nature of the tradeoffs faced when shifting GIs from the public domain to shared common property regimes, as defined by the EU legislation pertaining to GIs. In the context of trade globalization, we also compare the pros and cons of regulating GIs ex-ante rather than engaging in ex-post trademark litigation in the courts.
This paper examines the privatisation of Sydney Airport and the regime of ‘light-handed’ monitoring of service quality and airport charges that followed the sale in 2002. The arguments for privatisation are reviewed, in particular the need for increased competition and/or appropriate regulation where a former public monopoly, such as Sydney Airport, is sold. The aftermath of the privatisation of the airport has led to complaints by the major airlines and consumers of ever-increasing charges for use of the airfield and for car parking and other services. This highlights that the ‘light-handed’ monitoring regime has not constrained the airport’s ability to charge monopoly rents. Privatisation has also resulted in labour shedding, outsourcing and a focus on cost minimisation by the airport’s management.
There can be a serious tension between the commitment to cost-benefit analysis and a realistic appreciation of the limits of official knowledge. Without significant efforts to reduce those limits, that analysis might be inadequately informed. Whenever regulators face significant informational deficits, or what is sometimes called “the knowledge problem,” it is important to explore tools that take advantage of what the private sector knows; market-friendly tools, such as economic incentives, have important advantages on that count. An advanced regulatory system should also try to reduce the knowledge problem through three routes: (i) creative use of notice-and-comment rulemaking; (ii) retrospective analysis of regulations and their costs and benefits; and (iii) advance testing, as a way of informing ex ante analysis. For the future, the most promising approach is (iii).
Three common misconceptions persist about federal regulations. The first misconception is that most new regulations concern the environment, but in fact, only a small minority of regulatory flows are environmental. The second misconception is that regulators offer reasonable justifications and quantitative evidence for the majority of regulations. However, quantitative estimates rarely appear in published rules, belying the impression given by executive orders and Office of Management and Budget guidance, which require cost-benefit analysis (CBA) and clearly articulate sound economic principles for conducting CBA. Environmental rules have relatively higher-quality CBAs, at least by the standards of other federal rules. The third misconception, which is particularly relevant to the historic regulations promulgated during the COVID-19 pandemic, is that regulatory costs are primarily clerical, rather than opportunity or resource costs.
In the United States, over 70% of milk production is priced under Federal Milk Marketing Orders (FMMOs). A primary purpose of FMMOs is to facilitate orderly allocation of milk as a limited, perishable resource among alternative uses. Fundamental to FMMOs are the regulatory prices applicable to milk used in cheese and whey (Class III), and nonfat dry milk and butter (Class IV). This work examines a novel milk pricing method based on the concept of opportunity cost for milk used in cheese and whey. This novel method may improve the functioning of FMMOs and the U.S. dairy industry.
Food regulations protect consumer health, mitigate environmental concerns, and promote animal welfare, but they can also hinder innovation, limit entrepreneurship, and generate higher consumer prices. This study examines the number of federal and state regulatory restrictions affecting the beef, pork, poultry, sheep, goat, and seafood industries, including processing, wholesale distribution, and retail sales. We also examine state regulatory heterogeneity associated with animal protein products. Our results suggest that protein supply chains have become subject to tens of thousands of regulatory constraints over the past half-century. We also find substantial heterogeneity in the number of state restrictions associated with animal production, indicative of large differences in the amount of administrative law across states. Results highlight that the patchwork approach of U.S. food policy creates overlapping, cumbersome guidelines for manufacturers, and given the interconnectivity of modern food supply chains, the framework can create additional hurdles for interstate commerce.
The historical dynamics of entry and exit in the financial exchange industry are analyzed for a panel of 327 US exchanges from 1855 through 2012. We focus on economic, technological and regulatory factors. Using novel panel data evidence, we empirically test whether these factors are consistent with existing financial theories. We find that US exchanges face a higher annual probability of exit after the passage of the Securities Exchange Act. The telephone, literacy and regulation are robust predictors of financial exchange dynamics, whereby an upward trend in literacy is an important driver of exchange entry.
This paper conducts a novel empirical analysis of the effect of environmental regulation on local pollution emissions, using 84 cases of local legislation among 31 provinces in China during 1990–2009. We combine matching methodology with the difference-in-differences method to estimate the causal effect of provincial environmental legislation. Our estimates uncover no significant pollution abatement effect on average; environmental legislation decreases local pollution emissions only in provinces with stricter enforcement. These results remain robust when we consider time lag effects, different types of pollutants, and different choices of comparison groups, and when we use the synthetic control method. Overall, our study shows the importance of enforcement for environmental legislation in China.
The United States Environmental Protection Agency (USEPA) has regulated drinking water since the 1974 Safe Drinking Water Act (SDWA). Congress directed it to achieve three conflicting goals: (i) establish stringent nationwide standards, (ii) ensure that these standards are both technologically and economically feasible, and (iii) accommodate significant differences in cost among water systems of different sizes with different water sources. USEPA chose to emphasize goal (i) at the expense of (ii) and (iii). In 1986, Congress intensified its preference for (i), was silent concerning goal (ii), and criticized USEPA for failing to achieve goal (iii). In lieu of economic feasibility, the Agency substituted “affordability,” defined as expenditures up to 2.5% of national median household income irrespective of the benefits. This imposed deadweight losses and substantial inequities on rural areas, low-income communities, and low-income households generally. In 1996, Congress directed USEPA to use benefit-cost analysis positively and normatively. Regulations issued since 1996 do not appear to comply, however. A review of post-1996 drinking water standards indicates that most were certified by USEPA as having benefits that justified costs, but these determinations were unsupported by the Agency’s own regulatory impact analyses. This article proposes that USEPA define by regulation that “economic feasibility” means marginal benefits exceed marginal costs for the smallest water system subject to SDWA, and that all future drinking water standards must be economically feasible. Economic efficiency would be greatly enhanced and the pervasive inequities of “affordability” greatly diminished. Unlike “affordability,” this definition is objective and compatible with lay intuition about the meaning of key regulatory terms.
In the Dutch health care system, health insurers negotiate with hospitals about the pricing of hospital products in a managed competition framework. In this paper, we study these contract prices, which became publicly available for the first time in 2016. The data show substantial price variation between hospitals for the same products, and within a hospital for the same product across insurers. About 27% of the contract prices for a hospital product are at least 20% higher or lower than the average contract price in the market. For about half of the products, the highest and the lowest contract prices across hospitals differ by a factor of three or more. Moreover, hospital product prices do not follow a consistent ranking across hospitals, suggesting substantial cross-subsidization between hospital products. Potential explanations for the large and seemingly random price variation are: (i) different cost pricing methods used by hospitals, (ii) uncertainty due to frequent changes in the hospital payment system, (iii) price adjustments related to negotiated lumpsum payments and (iv) differences in hospital and insurer market power. Several policy options are discussed to reduce variation and increase transparency of hospital prices.
The Food and Drug Administration (FDA) published a final regulation in 2004 that requires pharmaceutical manufacturers to place linear bar codes on certain human drug and biological products. The intent was that bar codes would be part of a system in which healthcare professionals use bar code scanning equipment and software to electronically verify, against a patient’s medication regimen, that the correct medication is being given before it is administered, ultimately reducing medication errors. In the 2004 prospective regulatory impact analysis, FDA anticipated that the rule would stimulate widespread adoption of bar code medication administration technology among hospitals and other facilities, thereby generating public health benefits in the form of averted medication errors. FDA estimated that annualized net benefits would be $5.3 billion. In this retrospective analysis, we reassess the costs and benefits of the bar code rule and our original model and assumptions. Employing the most recent data available on actual adoption rates of bar code medication administration technology since 2004 and other key determinants of the costs and benefits, we examine the impacts of the bar code rule since its implementation and identify approaches to improve the accuracy of future analyses. We use alternative models of health information technology diffusion to create counterfactual scenarios against which we compare the benefits and costs of the bar code rule. The magnitudes of the costs and benefits of the 2004 rule are sensitive to assumptions about the counterfactual technology adoption rate, with the upper-bound range of calculated annualized net benefits between $2.7 billion and $6.6 billion depending on the baseline scenario considered.
Disclaimer: The findings, interpretations, and conclusions expressed in this article are those of the authors in their private capacities, and they do not represent the views of the Food and Drug Administration.
While regulators, firms, and the courts must all be able to interpret regulations to best address economic and social issues, regulatory interpretation may vary greatly across parties. After introducing a framework to explain the impact of the complexity of written regulations and the complexity of the regulatory environment on regulatory interpretation, this paper utilizes regulatory examples to explore the challenges associated with regulatory interpretation. Several recent initiatives designed to improve regulatory efficacy are examined to assess potential methods available to reduce challenges associated with regulatory interpretation. When considered with the public policy implementation literature and research on networks in public policy, several implications emerge from the consideration of regulatory interpretation and recent regulatory initiatives. Regulators should pursue strategies to minimize the number of possible interpretations in the design of regulation and seek improved regulatory mechanisms to alleviate regulatory interpretation challenges. Furthermore, theoretical models should acknowledge regulatory interpretation to better assist in the design and implementation of regulation.
Retrospective, or ex post, analysis of U.S. federal regulation aims to rigorously document regulatory outcomes using cost, benefit, and distributional metrics. This paper presents nine new case studies involving a total of 34 comparisons of ex ante and ex post estimates from a diverse group of environmentally oriented rules. Despite the potential for selection bias and other limitations of the case study approach, the results suggest a slight tendency to overestimate both costs and benefits (or effectiveness) of regulation. This paper considers various analytic issues relevant to developing credible baselines for comparison, and offers policy lessons regarding the design of emissions trading programs along with approaches for incorporating uncertainty into both preregulatory studies and policy designs. Recommendations to facilitate and support future retrospective analyses are also presented.
Executive Order (EO) 13771 on “Reducing Regulation and Controlling Regulatory Costs” introduces a new regulatory budgeting system in the U.S. federal rulemaking process. International experience suggests that the new rule, aimed both at reducing the number of regulations and the volume of regulatory costs, will focus on a subset of regulatory impacts, most likely the direct costs imposed by regulation on businesses, or even a subset thereof. The paper discusses possible ways to make sense of the new rule without undermining the soundness of the benefit-cost analysis mandated by EO 12866. The paper concludes that the new system, while potentially promoting more retrospective regulatory reviews, risks fundamentally affecting the quality of regulation in the United States, generating frictions and inefficiencies throughout the administration, to the detriment of social welfare.
Regulatory impact analyses (RIAs) weigh the benefits of regulations against the burdens they impose and are invaluable tools for informing decision makers. We offer 10 tips for nonspecialist policymakers and interested stakeholders who will be reading RIAs as consumers.
1. Core problem: Determine whether the RIA identifies the core problem (compelling public need) the regulation is intended to address.
2. Alternatives: Look for an objective, policy-neutral evaluation of the relative merits of reasonable alternatives.
3. Baseline: Check whether the RIA presents a reasonable “counterfactual” against which benefits and costs are measured.
4. Increments: Evaluate whether totals and averages obscure relevant distinctions and trade-offs.
5. Uncertainty: Recognize that all estimates involve uncertainty, and ask what effect key assumptions, data, and models have on those estimates.
6. Transparency: Look for transparency and objectivity of analytical inputs.
7. Benefits: Examine how projected benefits relate to stated objectives.
8. Costs: Understand what costs are included.
9. Distribution: Consider how benefits and costs are distributed.
10. Symmetrical treatment: Ensure that benefits and costs are presented symmetrically.
Numerous regulatory reform proposals would require federal agencies to conduct more thorough economic analysis of proposed regulations or expand the resources and influence of the Office of Information and Regulatory Affairs (OIRA), which currently reviews executive branch regulations. Such reforms are intended to improve the quality of economic analysis agencies produce when they issue major regulations. We employ newly gathered data on variation in current administrative procedures to assess the likely effects of proposed regulatory process reforms on the quality of agencies’ regulatory impact analyses (RIAs). Our results suggest that greater use of advance notices of proposed rulemakings for major regulations, advance consultation with regulated entities, use of advisory committees, and expansion of OIRA’s resources and role would improve the quality of RIAs. They also suggest pre-proposal public meetings with stakeholders are associated with lower quality analysis.
Applying benefit-cost analysis in the White House regulatory oversight process served as a basic mission of the Council on Wage and Price Stability (CWPS) during its seven-year lifespan (1974–1981). This paper reviews that CWPS experience, which involved filing comments in over 300 proceedings at more than 25 federal regulatory agencies. The paper draws on those CWPS public comments (filings), identifying persistent and pervasive deficiencies in the economic analysis regulators then and now often use as support for new regulation. CWPS filings fostered greater acceptance of benefit-cost analysis in regulatory decisions; such analysis is now required by executive order.