
Missing the Trees for the Forest in Industrial Policy

A new manual for industrial policy, while valuable, makes several glaring omissions.


Industrial Policy for the United States: Winning the Competition for Good Jobs and High-Value Industries, by Marc Fasteau and Ian Fletcher. 849 pages with index. Cambridge University Press, 2024.

Although many (including this writer) will reject its conclusions, Fasteau and Fletcher’s compendium, Industrial Policy for the United States: Winning the Competition for Good Jobs and High-Value Industries, will serve as the standard reference work on industrial policy for the foreseeable future. Its 800 pages provide a thorough survey of all the major economies’ experience with government planning, including a sober assessment of successes and failures. The authors rightly emphasize the key role of military R&D. Nonetheless, they miss the trees for the forest, so to speak—namely, the singular contributions of maverick inventors. Innovation can’t be budgeted and scheduled, only fostered and encouraged. And that depends on a delicate balance between government support and private initiative.

The authors want the government to remake the economy, with a new corps of federal officials empowered to direct investment to favored industries. In their enthusiasm, they ignore the gross deficiencies of the most ambitious piece of industrial policy in decades, namely the Biden CHIPS and Science Act of 2022. And they naively propose a devaluation of the U.S. dollar to promote exports without considering the ways in which cheapening the currency adversely affects manufacturing. 

“In 2021 and 2022, Biden proposed and Congress enacted the Bipartisan Infrastructure Act (BIA), the CHIPS and Science Act (CHIPS), and the Inflation Reduction Act (IRA). These ambitious new programs, combined with their explicitly pro-industrial policy rationales, were a big step forward,” the authors write. They worry that the $170 billion CHIPS Act wasn’t big enough: “The Act was a major advance, but the aid it provides, while sizeable, is dwarfed by that provided by Taiwan, Korea, and China.”

The CHIPS Act subsidies prompted $450 billion in planned investments, according to the Semiconductor Industry Association, but the industry encountered crippling shortages of skilled labor, engineers, and infrastructure. The cost of building new industrial plants jumped by 30 percent in little more than a year, and unfilled construction job openings rose to an all-time record in 2023. Plant openings by TSMC, Samsung, and other fabricators were delayed by years. Intel took $8.5 billion in subsidies under the CHIPS Act and shortly thereafter laid off 15,000 workers and cut capital expenditures by 20 percent.

The CHIPS Act turned out to be a horrible example of how industrial policy can go wrong. Apart from its shoddy implementation, Biden’s venture into industrial policy failed to encourage research into new semiconductor technologies that promise increases of computing speed by orders of magnitude. The authors discuss molecular electronics, which, if successful, would create circuits from individual molecules rather than silicon wafers, but they do not mention the absence of support for such technologies in the CHIPS Act.

Perhaps the serried ranks of federal officials proposed by the authors would have foreseen these bottlenecks, but Fasteau and Fletcher did not. The term “skilled labor” appears just five times in the book and only once with reference to the United States. American manufacturers invariably cite the lack of skilled personnel as the single biggest constraint on expansion. A worker with a high school diploma and a year’s training can earn $60,000 a year operating a computer-controlled machine, but this work requires proficiency in high-school math (for example, trigonometry). Fewer than a quarter of American high school students are rated proficient in math, according to the Department of Education, and those who are proficient aren’t looking for factory work.

The authors mention Germany’s apprenticeship system as an element of that country’s industrial policy, but are silent about the abysmal state of American secondary education. High schools used to teach industrial skills; I still have the draftsman’s set my father used at a Brooklyn public school before starting as a machinist’s apprentice at the Brooklyn Navy Yard. Nor do Fasteau and Fletcher mention that just seven percent of U.S. undergraduates major in engineering, versus a third in China, which now graduates 1.2 million engineers each year compared with 200,000 in the United States. They provide detailed reports of university programs in quantum computing and nanotechnology, but ignore the biggest single problem now facing American industry.

The role of the military in promoting innovation is a central theme in their account. “The shadow of Mars is long,” they observe. “The Englishman Henry Bessemer,” who invented modern steelmaking, “had been trying to make a cannon strong enough to fire new rifled artillery shells.” They rightly draw attention to the national security imperative in inspiring innovation, but their account has an important lacuna.

A set of breakthroughs in the 1970s—optical networks, CMOS manufacturing of integrated circuits, and the Internet, among others—launched the Digital Age. As former Deputy Secretary of Defense Bob Work explained:

In 1973, the Yom Kippur War provided dramatic evidence of advances in surface-to-air missiles, and Israel’s most advanced fighters, flown by the top pilots in the Middle East, if not among the world’s best, lost their superiority for at least three days due to a SAM belt. And Israeli armored forces were savaged by ATGMs, antitank guided munitions.

U.S. analysts cranked their little models and extrapolated that [if] the balloon went up in Europe’s central front and we had suffered attrition rates comparable to the Israelis, U.S. tactical air power would be destroyed within 17 days, and NATO would literally run out of tanks.

Vietnam fell two years later, and the American military went back to the drawing board. By 1978, advances in chip manufacturing had put computers capable of running look-down radar into the cockpits of fighter planes. By 1982, American avionics helped Israel destroy the Syrian air force in the Beqaa Valley “turkey shoot.”

Their account of the role of the Defense Advanced Research Projects Agency (DARPA) and other government agencies in promoting industrial innovation is extensive, although it misses some decisive points.

Part of the problem is that federal R&D funding has shrunk as a share of government spending and GDP. “Federally funded R&D—the spending that generates fundamental technological breakthroughs—peaked at 1.9 percent of GDP in 1962, fell to 0.7 percent by 2020, and as of mid-2024 is only at the beginning of a possible turnaround,” they note. DARPA funding made possible Sergey Brin and Larry Page’s Google search algorithm, the voice recognition system later branded as Apple’s Siri, the Internet, the analog-to-digital transformation that enabled the smartphone, as well as GPS, stealth technology, night vision, smart weapons, and a vast number of other innovations.

When the U.S. military is compelled to innovate as a matter of national security, it funds research at the frontier of physics. This puts technology in the hands of entrepreneurs who want to create new products. The Achilles’ heel of industrial policy is rent-seeking by corporations. When technology changes incrementally, industry easily corrupts the officials responsible for doling out federal money by offering them future employment. But when national security demands breakthroughs at the frontier of physics, entrepreneurs gain access to technology that challenges the existing business structure. That is what happened during the 1980s, when startups like Cisco, Intel, Apple, and Oracle became the new corporate giants.

Federal bureaucrats do a poor job of picking winners in the business world, and they don’t do a good job of forecasting technological breakthroughs, either. Although virtually every important innovation of the Digital Age began with DARPA funding, the most important of these inventions had little to do with the initial motivation for the project. An example related by Dr. Henry Kressel, the former head of RCA Labs, is the semiconductor laser: The military wanted to illuminate battlefields for night fighting. Kressel and his team took DARPA’s money and perfected a laser that could transmit vast quantities of information through optical cables, making the Internet possible. 

Maverick engineers, not federal planners, discovered the most important innovations. The great corporate labs at RCA, IBM, GE, and the Bell System formed half of a public-private partnership in which the government paid for basic research, while private capital took the risk of commercialization.

Fasteau and Fletcher draw attention to America’s declining share of manufacturing in GDP and its widening trade deficit. They propose withdrawing from the World Trade Organization, rejecting any new free trade agreements, and raising tariffs, along with a devaluation of the U.S. dollar. They caution against disruptive, sudden action:

Tariff rate quotas and tariffs phased in over time should be used to nurture industries the U.S. is attempting to develop, is in danger of losing, or is trying to regain. For example, the federal government’s current $54 billion effort to rebuild U.S. capability in semiconductors should be supported by a staged tariff and quota policy. Said policy should track along with and protect the development of American production capacity, but not prematurely burden US users of advanced chips that domestic manufacturers are not yet capable of making.

Caution is called for indeed, given that we now import most of our capital goods. To reduce dependence on imports, we must invest in new capacity, which means increasing imports of capital goods for some years before domestic production can replace them.

Less convincing is the authors’ plaidoyer for a cheap dollar. The steepest decline in manufacturing employment in U.S. history occurred during the 2000s while the US dollar’s real effective exchange rate fell sharply. That does not imply that a falling dollar caused the decline in employment, but rather that more important factors were at work. Perhaps the most important price point in capital-intensive investment is the cost of capital itself. Stable currencies generally are associated with a low cost of capital, because currency depreciation promotes inflation, and inflation adds both a surcharge and a risk premium to the cost of capital. 

When corporations write off investments against taxable income over a period of years, inflation erodes the real value of the depreciation allowances and thus raises the effective corporate tax rate. For that matter, Fasteau and Fletcher praise Japan’s use of accelerated depreciation to promote investment, but have nothing to say about the subject as it might apply to the United States. Tax relief might prove a more effective incentive for manufacturing investment than tariffs.
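To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch; the investment size, write-off period, tax rate, and discount rates are illustrative assumptions, not figures from the book. Because straight-line depreciation deductions are fixed in nominal dollars, higher inflation raises the nominal discount rate and shrinks the present value of the tax shield, which works out to a higher effective tax rate on the same real investment.

```python
# Illustrative sketch: how inflation erodes the real value of straight-line
# depreciation deductions. All numbers below are assumptions for illustration.

def pv_of_depreciation(investment, years, real_rate, inflation, tax_rate):
    """Present value of the tax savings from straight-line depreciation,
    discounted at the nominal rate (real rate plus inflation)."""
    nominal_rate = real_rate + inflation
    annual_deduction = investment / years          # fixed in nominal dollars
    annual_tax_saving = annual_deduction * tax_rate
    return sum(annual_tax_saving / (1 + nominal_rate) ** t
               for t in range(1, years + 1))

investment, years, real_rate, tax_rate = 1_000_000, 10, 0.03, 0.21

for inflation in (0.00, 0.04, 0.08):
    pv = pv_of_depreciation(investment, years, real_rate, inflation, tax_rate)
    print(f"inflation {inflation:.0%}: PV of depreciation tax shield = ${pv:,.0f}")

# Higher inflation -> lower present value of the nominally fixed deductions,
# i.e., a heavier effective tax burden on the same real investment.
```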

Despite these flaws, Industrial Policy for the United States belongs in the library of every policymaker concerned about the state of U.S. industry. 

