Agenda 2016: Reviving U.S. Economic Growth

The Progressive Policy Institute (PPI) teamed up with Columbia University’s Richard Paul Richman Center for Business, Law, and Public Policy to co-host a compelling symposium Nov. 6-7 in New York on revitalizing the U.S. economy. The event featured a distinguished roster of Richman Center economists and scholars, as well as PPI analysts and special guests, and more than two dozen top policy aides to Members of Congress, Governors, and Mayors.

Held on Columbia’s Manhattan campus, the symposium examined the U.S. economy’s recent performance, as well as the causes of the long-term decline of productivity and economic growth. Against the backdrop of the 2016 election debate, the participants grappled with specific ideas for unleashing more economic innovation, modernizing infrastructure, reforming taxes, improving regulation, expanding trade and reducing inequality by ensuring that all children have access to high-quality public schools.

The discussions, which were off-the-record to encourage maximum candor, featured the following speakers and topics:

  • An overview of the U.S. economy’s recent performance by Abby Joseph Cohen, President of the Global Markets Institute and Senior Investment Strategist at Goldman Sachs.
  • A roundtable on key elements of a high-growth strategy, led by Michael Mandel, Chief Economic Strategist at PPI; Andrew Stern, former head of the Service Employees International Union and now Ronald O. Perelman Senior Fellow at the Richman Center; and Philip K. Howard, Founder of Common Good, a nonpartisan reform coalition. The conversation touched on ways to improve the regulatory environment for innovation, including reducing regulatory accumulation and requiring faster permitting for big infrastructure projects, and featured a lively debate on the future of work in a tech-driven knowledge economy.
  • An insightful macroeconomic analysis of why productivity and economic growth have slowed, by Pierre Yared, Associate Professor at the Columbia Business School and Co-director of the Richman Center. Yared highlighted three potential contributors to the slowdown: labor demographics and participation; “capital intensity” or business investment; and the “production efficiency” of U.S. companies.
  • A detailed examination of the impact of energy innovation—from the shale boom to renewables and the construction of a new, “smart” grid—on jobs and economic growth. Leading this segment were Jason Bordoff, former energy advisor to President Obama and now Director of Columbia’s Center on Global Energy Policy, and Derrick Freeman, Director of PPI’s Energy Innovation Project.
  • A dinner conversation at the Columbia Club with Edmund Phelps, the 2006 Nobel Laureate in Economics and Director of Columbia’s Center on Capitalism and Society. Drawing on his recent book, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge and Change, he stressed the importance of indigenous innovation in creating the conditions for broad upward mobility. He also emphasized the crucial role of “modern” or individualistic cultural values in sustaining the mass innovation and entrepreneurship America needs to flourish again.
  • A detailed look at business taxation and reform as a potential driver of economic growth. It featured Michael Graetz, Alumni Professor of Tax Law at Columbia Law School; David Schizer, Dean Emeritus and the Harvey R. Miller Professor of Law and Economics at Columbia Law School and Co-director of the Richman Center; and PPI’s Michael Mandel. The discussion ranged widely over global tax frictions, including the OECD’s new “BEPS” project; the need for corporate tax reform; “patent boxes”; and mounting U.S. interest in consumption taxes.
  • A roundtable on trade and productivity growth with Ed Gerwin, PPI Senior Fellow for Trade and Global Opportunity, and the versatile Michael Mandel. Noting President Obama’s controversial call for a Trans-Pacific Partnership, Gerwin stressed the agreement’s potential for “democratizing” trade by making it easier for U.S. small businesses to connect with customers abroad. Mandel underscored another PPI priority: raising awareness among policymakers of the growing contribution of cross-border data flows to growth here and abroad, and the need to push back against proposals that would impede “digital trade.”
  • A luncheon presentation on “financial regulation after the crisis” by Jeffrey Gordon, Richard Paul Richman Professor of Law at Columbia Law School and Co-director of the Richman Center. Gordon described the new regime put in place by Dodd-Frank and other rules to guard against the “systemic risk” of another financial meltdown, and suggested its “perimeter” may need to be expanded beyond banks.
  • The symposium’s final panel featured a vigorous discussion on K-12 education reform and the economy. The discussants were Jonah Rockoff, Associate Professor at the Columbia Business School, and David Osborne, who directs PPI’s Reinventing America’s Schools Project and is a co-author of the seminal “Reinventing Government.” Rockoff highlighted research showing that the returns to school improvement are enormous, and recommended reforms that could increase school quality. Osborne traced the evolution of school governance in America, and offered detailed looks at new models emerging in cities like New Orleans and Washington, D.C., both of which are leaders in the public charter school movement.

The symposium gave the policy professionals who participated a rare opportunity to delve deeply into complicated economic realities, guided by presenters of extraordinarily high caliber. The conversations were highly illuminating and will inform PPI’s work on Agenda 2016—a new blueprint for reviving U.S. economic dynamism and opportunity.

Wall Street Journal: Broadband Investment is Falling

PPI Senior Fellow Hal Singer’s analysis on the impact of the FCC’s net neutrality ruling was cited in the Wall Street Journal:

Before Obamanet went into effect, economist Hal Singer of the Progressive Policy Institute predicted in The Wall Street Journal that if price and other regulations were introduced, capital investments by ISPs could quickly fall from the $77 billion invested in 2014—between 5% and 12% a year, according to his forecast.

Now Mr. Singer has analyzed the latest data, and his prediction has come true. He found that in the first half of 2015, as the new regulations were being crafted in Washington, major ISPs reduced capital expenditure by an average of 12%, while the overall industry average dropped 8%. Capital spending was down 29% at AT&T and Charter Communications, 10% at Cablevision, and 4% at Verizon. (Comcast increased capital spending, but on a new home-entertainment operating system, not broadband.)

Until now, spending had fallen year-to-year only twice in the history of broadband: in 2001 after the dot-com bust, and in 2009 after the recession. “In every other year,” Mr. Singer wrote for Forbes, “ISPs—like hamsters on a wheel—were forced to upgrade their networks to prevent customers from switching to rivals offering faster connections.”

Continue reading at the Wall Street Journal.

The Hill: No injury. No lawsuit. No service.

The Supreme Court this month received the first round of briefing in a case that could cure one of the newest, most significant abuses in our civil justice system: massive class actions that lawyers file on behalf of people who are not injured. In these cases, the class action plaintiffs’ lawyers use novel legal theories and damage models to get their classes certified and then count on companies to settle the claims and pay them attorney fees – sometimes for more than the class members will end up collecting from the settlement.

The whole point of civil litigation is to make people whole for their losses. Any person who is not injured and has no loss to be corrected should have his or her claim dismissed. The person has no substantive legal basis for the claim, and Article III of the U.S. Constitution gives federal courts jurisdiction only over cases where people allege actual injury traceable to the defendant. But, what happens when uninjured people are nonetheless swept into federal class actions?

This is the issue before the Supreme Court in Tyson Foods, Inc. v. Bouaphakeo. The plaintiffs’ counsel used a controversial damages model to turn discrete wage-and-hour claims for some Tyson employees into a much larger class action. They created an “average employee,” claiming that this “average employee” would be due overtime pay if the time taken to put on and take off protective gear was included in the work week. They then sought to have every class member – some 3,300 people – paid the same overtime as the “average employee,” regardless of how much the real employees actually worked, spent putting on and taking off gear, or were paid.

The problem is that hundreds of class members had no injury at all. It was clear under the plaintiffs’ own statistical sampling model that these employees were fully paid, even accounting for the time to put on and take off gear. Yet, the district court certified the case as a class action with these uninjured people. At trial, the jury found that the modeling substantially overstated the damages and that about half of the class had no injury or only a de minimis one. Yet, the court allowed all class members, including the uninjured, to get the same pro rata share of the award.

Continue reading at The Hill.

Does the FCC’s Open Internet Order Survive a Cost-Benefit Test? These 13 Economists Don’t Think So.

Yesterday, a stellar constellation of regulatory economists—including three economists affiliated with the Progressive Policy Institute—submitted an amicus brief to the D.C. Circuit Court of Appeals, demonstrating that the Federal Communications Commission’s 2015 Open Internet Order failed a cost-benefit test.

How could this happen?

When proposing a remedy to address a perceived market failure, a regulatory agency may fail a cost-benefit test in three ways. First, the agency can overstate the benefits of its proposed remedy. Second, the agency can understate the costs of its proposed remedy.

Third, and a bit less obvious, the agency can ignore a less-restrictive alternative that would generate the same purported benefits but at a lower cost, thereby rendering its proposed remedy inefficient. For example, if the net benefits of a proposed remedy are $10 million per year, but a less-restrictive alternative generates net benefits of $15 million, then the proposal fails a cost-benefit test, even though the proposed remedy would have generated benefits in excess of costs.
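This third failure mode can be sketched in a few lines of Python. The gross benefit and cost figures below are invented for illustration; only the $10 million and $15 million net figures come from the example above:

```python
# Hypothetical illustration of the third cost-benefit failure mode:
# a remedy whose benefits exceed its costs can still fail the test if a
# less-restrictive alternative yields the same benefits at lower cost.
# Figures are in $ millions per year; gross amounts are assumed for illustration.
proposed = {"benefits": 25.0, "costs": 15.0}     # net benefit: $10M
alternative = {"benefits": 25.0, "costs": 10.0}  # same benefits, lower cost: net $15M

def net(remedy):
    """Net benefit of a remedy: benefits minus costs."""
    return remedy["benefits"] - remedy["costs"]

# The proposed remedy's benefits exceed its costs...
assert net(proposed) > 0
# ...yet it fails the test, because the alternative does better.
passes_test = net(proposed) >= net(alternative)
print(passes_test)  # False
```

The point the comparison makes is that "benefits exceed costs" is necessary but not sufficient; efficiency requires beating the best available alternative.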

The FCC committed all three errors in its Open Internet Order (OIO). As Chris Cillizza of the Post says in his recurring award for Worst Week in Washington, “Congrats, or something.”

The amicus brief explains in great detail how the FCC committed the first two errors.

In terms of overstating benefits, the OIO fails to consider that the profitability of (and thus the incentive to engage in) discriminatory conduct vis-à-vis content providers depends on whether the Internet service provider (ISP) could generate higher profits from the promoted (affiliated) products to cover the lost margins from departing broadband customers. The anticompetitive behavior feared by the Commission has simply not come to pass, which explains why the OIO is hard-pressed to cite any recent examples of consumer harm. A very limited number of service disruptions or degradations have actually occurred—among literally millions of opportunities for such behavior—and many of these have been dealt with expeditiously through private negotiations.

And in terms of understating costs, the OIO ignores or dismisses the economic evidence of the impact of Title II on investment in the late 1990s and early 2000s, and thereby dismisses the very real threat to ISP investment. Rather than ground its findings on economic scholarship, the OIO relies instead on the casual empiricism of an advocacy group that operates outside of the constraints of academic reputations, to reach the extraordinary conclusion that telco investment was “55 percent higher under the period of Title II’s application” than in the later period. These results hinge on which years are included in the Title II era: If one includes the years 1999 and 2000 as part of the pre-2005 period, then removal of Title II appears to have caused a decline in Bell investment. But those early years are associated with the dot-com boom and long-haul fiber glut, and it is difficult to remove Bell investments in backbone infrastructure from the capex figures.

The amicus brief spends less time on the third element of cost-benefit analysis, largely due to a 4,000-word limitation. So more on that here.

The OIO casually dismisses a less-restrictive alternative for handling paid priority disputes—namely, case-by-case enforcement—as being “too cumbersome” to enforce, despite the fact that: (1) the 2015 OIO itself embraces case-by-case review to address interconnection disputes and other conduct such as zero-rating; (2) the 2010 Open Internet Order embraced case-by-case to address paid priority disputes; (3) the FCC’s May 2014 Notice of Proposed Rulemaking would have permitted ISPs and content providers to engage in “individualized bargaining” subject to ex post review; and (4) the FCC relies upon case-by-case to adjudicate discrimination complaints against traditional video distributors. Why is this conduct different from all other conduct?

Recognizing this disparate treatment of paid priority and interconnection, the OIO argues that case-by-case enforcement “is an appropriate vehicle for enforcement where disputes are primarily over commercial terms and that involve some very large corporations. . . .” (paragraph 29). But interconnection disputes can involve small content providers as well. And if the concern is an asymmetry in litigation resources, the case-by-case regime can level the playing field by shifting evidentiary burdens and providing interim relief.

Indeed, the 2010 Open Internet Order considered and rejected a “flat ban” on paid priority in favor of a case-by-case approach; embracing the ban in 2015 presumably pushed the FCC towards its dreaded reclassification decision. This dramatic policy reversal raises the question: What happened in the intervening five years that caused the Commission to lose confidence in case-by-case adjudication for paid priority? The OIO does not give an answer.

It would seem that an overt and pronounced shift in regulatory policy would necessitate a clear and confident finding that such an alternative policy approach toward the Internet would produce better results—more innovation, more investment, and more consumer benefits. When viewed with an economic lens, the OIO fails a basic cost-benefit analysis.

The Sacramento Bee: Science, not politics, should drive California regs on BPA

As the presidential campaign season gets underway, it reminds us how much we loathe the politics of fear mongering. Half-truths and half-baked policy proposals have become staples of modern campaigns. You betcha!

Until recently, there was a difference between campaigning and governing. Governments are supposed to base their decisions on hard facts and real science. In today’s hyperpoliticized culture, though, the regulatory process can also get hijacked by special-interest groups armed with “narratives” that are simple, emotional and deeply misleading.

California, which is widely known for its progressive politics, should shun governing by scare tactics in defining today’s progressive vision for regulation. As Vice President Al Gore did with the National Partnership for Reinventing Government, progressives should embrace the pragmatic view that agencies should get smart on an issue, develop targeted regulations and use their authority to solve real problems.

California voters back this vision. They recently rejected a ballot measure requiring warnings on genetically modified foods as unwise regulation. The Obama administration has since concluded the fears of GM products are unwarranted. In April, the U.S. Trade Representative chastised European regulators for allowing “opt-outs” of U.S. imports of GM products as “ignor[ing] science-based safety and environmental determinations.”

Chalk one up for California’s progressive governing.

On the other side of the ledger is California’s decision last month to add bisphenol A, or BPA, to the list of toxicants under Proposition 65. BPA has been used to coat the inside of bottles and cans since the 1960s to keep harmful germs from growing inside them.

Continue reading at The Sacramento Bee.

Mandel: Eliminating an Obsolete Regulation at the FCC (Updated)

Update (6/11/15): PPI applauds the FCC’s adoption of the “effective competition” order on June 2 (explained below). This order acknowledges the reality that on most cable systems, video channels are subject to “effective competition” from other providers, both satellite and landline. The FCC order says in part: “As a result, each franchising authority will be prohibited from regulating basic cable rates unless it successfully demonstrates that the cable system is not subject to Competing Provider Effective Competition.”

This is not the FCC making new law; rather, it is the FCC enforcing the provisions of existing law, which clearly states the conditions under which basic cable service rates can be freed from local regulation. Given the importance of eliminating or rewriting outmoded regulations wherever possible, the FCC has done the right thing.


 

5/13/15

PPI favors the elimination or rewriting of outmoded regulations wherever possible. We believe that clearing the deadwood of obsolete rules is a win-win for consumers, workers, and businesses, allowing regulators to focus limited resources on more important issues while freeing companies to innovate faster.

That’s why we strongly favor FCC Chairman Tom Wheeler’s proposal to streamline the “effective competition” rule for cable video providers. Cable television has long been one of the most regulated industries in the economy, including regulation of their rates by local authorities. The justification for such price controls—not acceptable for most industries—was the lack of meaningful competition from other video providers.

But the world has changed. Today many if not most cable video systems face a wide range of competitors, from satellite providers such as DISH to telecom companies such as AT&T and Verizon, not to mention new internet-based video services such as Netflix and Amazon.

The legislation governing cable operators allows them to be relieved of some regulatory burdens, including rate regulation by local authorities, if the FCC rules that they face “effective competition.” The legislation includes several possible tests for effective competition, including a satellite video provider or other competitor having 15% of the pay video market, or a phone company offering video service in the area.

These hurdles are not hard to clear, given the prevalence of satellite and other video competitors. As a result, the FCC has ruled in favor of effective competition in almost all of the proceedings on this subject since 2013.

Nevertheless, up to now, cable video companies have had to go through a long and burdensome process to get regulatory relief. That is why Wheeler is proposing to simplify the process by adapting it to market realities. Challengers would have to demonstrate that effective competition did not exist in a particular location. The net result is that a larger number of cable video providers would have greater freedom to compete and innovate.

Given the amount of competition to cable, it is unlikely that cable video rates would suddenly jump. After all, with the prevalence of alternatives, and subscriber growth having topped out, why should cable companies drive away customers?

We have had disagreements with Chairman Wheeler, particularly around the Open Internet issue. But on this issue, his approach to cleaning up the regulatory process makes excellent sense for both consumers and companies.

The Wall Street Journal: How the FCC Will Wreck the Internet

The Federal Communications Commission injected a considerable amount of uncertainty into the high-tech sector in February when it reclassified Internet service providers (ISPs) as public utilities. If it is upheld by the courts, the Open Internet Order—which inserts the government directly into private dealings between ISPs and firms that generate or aggregate Internet content—will drag down investments in new networks and infrastructure and slow down innovation.

In a new paper for the Progressive Policy Institute, I estimate that ISP capital expenditures will fall between 5% and 12% per year relative to 2014 levels—based on experience in the late 1990s and early 2000s, the last time telecommunications companies were subject to public-utility rules.

This may not sound like much, but ISPs invested nearly $77 billion in 2014. A 5% drop means billions in network upgrades forgone. Thousands of jobs would also be lost—20 jobs for every million dollars of fiber investment, according to a paper I co-wrote with Jeffrey West in 2010. The losses won’t be limited to ISPs. Investment in new networks propels innovation everywhere, thanks to faster connections and greater capabilities.

From the late 1990s to 2005, telecommunications firms were required to offer a component of their DSL Internet service on a common-carrier basis. During that period their broadband investments grew at a significantly slower rate than those of cable competitors who were not subject to the utility regulations.

Continue reading at The Wall Street Journal.

Three Ways The FCC’s Open Internet Order Will Harm Innovation

The Federal Communications Commission’s 2015 Open Internet order threatens innovation in three distinct ways. First, by barring paid priority arrangements, the order undermines innovation in the nascent market for real-time applications like telemedicine and HD voice. Second, because sponsored-data plans (including zero-rating plans) may run afoul of its “general conduct” standard, the order could discourage procompetitive offerings that would subsidize Internet access for low income Americans. Third, by reclassifying Internet service providers (“ISPs”) as telecommunications providers under Title II of the 1934 Communications Act, the order will likely slow the flow of investment dollars by ISPs, which will adversely affect innovation.

This Policy Brief examines the potential harm to innovation in qualitative terms, and where possible, in quantitative terms. The major findings are as follows:

  • The nascent markets for certain real-time applications, including telemedicine, virtual reality, and HD voice, are expected to develop into billion-dollar industries in the coming years. Although no application needs priority to function per se, there is a class of applications that need a certain level of quality of service that is not always consistently available on networks, especially across wireless networks that are subject to congestion. The ban on payments for priority arrangements could undermine certain collaborations among ISPs and websites/application providers (“content providers”), and thereby thwart a non-trivial portion of these applications from taking root, potentially costing the U.S. economy hundreds of millions of dollars annually.
  • By discouraging ISPs and content providers from pursuing different ways to subsidize Internet access for consumers—another form of collaboration—the order could deny the poorest Americans hundreds of millions in benefits annually. There are millions of Americans for whom broadband is just out of reach and who would otherwise be eligible for a subsidy in the form of a sponsored-data plan.
  • Subjecting telecommunications companies to Title II in the early 2000s caused their capital expenditures to decline by between five and thirteen percent under conservative assumptions. Exposing ISPs to the same regulatory risk could undermine core investment to the same degree. Based on U.S. Telecom’s estimated $76 billion in aggregate capex among U.S. ISPs in 2014, such a reduction would amount to between a $4 and $10 billion decline in investment at the core of the network.
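The last bullet's dollar range follows directly from the percentage range and the capex base; here is a rough back-of-the-envelope check (a sketch of the arithmetic, not the paper's actual model):

```python
# Apply the estimated 5-13% Title II-era capex decline to U.S. Telecom's
# $76 billion 2014 aggregate ISP capex figure, as cited in the text.
capex_2014 = 76e9            # aggregate U.S. ISP capex in 2014, dollars
low, high = 0.05, 0.13       # decline range under conservative assumptions

decline_low = capex_2014 * low     # roughly $3.8B
decline_high = capex_2014 * high   # roughly $9.9B
print(f"${decline_low / 1e9:.1f}B to ${decline_high / 1e9:.1f}B")  # $3.8B to $9.9B
```

Rounded, that reproduces the "$4 to $10 billion decline" cited in the brief.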

 

Download “2015.05-Singer_Three-Ways-the-FCCs-Open-Internet-Order-Will-Harm-Innovation”

Carew for Republic 3.0: The Case for a Data-Driven FDA

The Food and Drug Administration (FDA) is fast finding itself at the center of the debate over healthcare regulation in the 21st century. At issue: to embrace the power of data-driven innovation, or to stand by the current regulatory paradigm. Fortunately, two major Congressional initiatives may be the push the FDA needs to see the data-driven economy as an opportunity instead of a risk.

Current rulemaking at the FDA follows a rubric laid out in 20th century legislation: safety and efficacy above all else. Medications and devices must be proven to be at least as good as what’s already available, through long and extensive clinical trials. All publicly available information about medications and devices must be deemed truthful and non-misleading, essentially sticking to only what’s “on-label.” The underlying assumption is that all drug and medical device companies are driven by profits, even if at the expense of public health.

The FDA’s current approach imposes strict requirements on drug and device companies that few other industries are subject to. Even the seemingly simple goal of sharing information is highly complicated under the current system. As PPI has documented, drug and device companies face a severely restricted ability to communicate information to the medical community and to consumers. So onerous are the requirements that many drug and device companies have more incentive to block the flow of information than to create it. Patients are hurt most because medical providers lack access to the best resources to treat them.

Such an outmoded approach to rulemaking will likely dampen future innovation and investment in healthcare. We live in a “sharing economy,” defined by the rise of the Internet, social media, and instant communication. Our unprecedented connectedness has facilitated an explosion of medical apps and real world observational data. Imagine how harnessing and sharing this information could help the 117 million Americans living with a chronic disease, or the 20 million Americans with cancer, many of whom rely on unapproved uses of approved drugs for treatment.

Yet such potentially valuable information could not be shared under current rules. The FDA requires that any sharing of information on unapproved uses of approved drugs be based on “adequate and well-controlled” clinical investigations, documented in peer-reviewed journals. Instead of embracing the power of data, the FDA seems to be scared of it.

Fortunately, two recent initiatives in Congress are addressing this critical need for rethinking the FDA’s approach to regulation. They are both ongoing efforts, driven by the opportunity to modernize our approach to healthcare rulemaking. Notably, both efforts explicitly address the outmoded approach to communications as a core part of 21st century healthcare regulation reform.

The House initiative, dubbed 21st Century Cures, has spent the last year collecting public comments and conducting analysis on how to use data-driven innovation to redesign healthcare. The most recent white paper notes, “as innovative companies know more about their products than anyone, precluding them from responsibly communicating about new scientific and medical developments does not promote the public health.”

In the Senate, Innovation for Healthier Americans similarly seeks to arm the FDA with tools to modernize healthcare regulation. It argues that restrictions on how drug and device companies can communicate could actually harm public health. The report notes that “in today’s online world[,] where doctors can look on the internet and find studies, it may be a disadvantage not to be able to discuss this information with the product developers who know the most about the project.”

The House and Senate efforts to modernize healthcare regulation give the FDA a rare opportunity to shine. By rethinking its approach to rulemaking in the 21st Century, the agency could define the future of U.S. healthcare design and delivery.

Instead of being viewed as bureaucratic, inefficient, opaque, and over-priced, the U.S. healthcare system could be innovative and dynamic. Customized nano-medicine, treatment delivered remotely, and apps that monitor chronic disease could be the envy of other healthcare systems.

Such a large-scale task may seem daunting for one regulatory agency, but the FDA could start small – say, with communications regulation. By allowing drug and medical device companies to better communicate with the medical community and consumers, a data-driven healthcare ecosystem could sprout and flourish. Each part of the diagnosis and treatment chain could work together, employing cost-saving techniques that improve patient outcomes.

The FDA opportunity should not be taken for granted. With the aid of Congress, it is possible for one agency to set the new gold standard for healthcare regulation, ensuring information is truthful and non-misleading, but also embracing the power of data to improve public well-being.

This is cross-posted from Republic 3.0.

Productivity Growth Continues to Plunge: Why A Growth Policy Is Necessary

Should progressives focus more on promoting growth, or fostering redistribution? The unfortunate fact is that we live in an era of weak productivity growth.  That means growth policies to encourage investment and innovation are essential for broad prosperity.

Based on today’s release from the BLS, ten-year productivity growth has now plunged to 1.4%, the lowest level since the 1980s (see chart below).  By comparison, ten-year productivity growth was 2.2% when Bill Clinton left office at the end of 2000, and hit a high of 3% at the end of 2005.

Productivity growth is the central force determining the size of the economic pie. Without productivity gains, living standards cannot show a sustainable rise.

 

Certainly real compensation growth is very weak as well. However, the difference between ten-year productivity growth and ten-year real compensation growth has also been narrowing.  It was 1.1 percentage points as of the first quarter of 2015, after peaking at 1.7 percentage points in 2011. That difference of 1.1 percentage points is only slightly above the 50-year average of 0.8 percentage points.

To put it a slightly different way, real compensation growth has fallen from 1.5% in 2000 to 0.3% today, a catastrophic drop. However, two-thirds of that plunge can be attributed to a drop in productivity growth (from 2.2% to 1.4%), and only one-third to a widening of the gap between productivity and compensation growth.
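The two-thirds/one-third decomposition in that paragraph can be verified with simple arithmetic, using only the figures quoted in the text:

```python
# Decompose the drop in real compensation growth into the part explained by
# slower productivity growth and the part explained by a wider
# productivity-compensation gap. All figures are from the text (percent).
prod_2000, prod_now = 2.2, 1.4   # ten-year productivity growth
comp_2000, comp_now = 1.5, 0.3   # ten-year real compensation growth

total_drop = comp_2000 - comp_now                 # 1.2 points
from_productivity = prod_2000 - prod_now          # 0.8 points
from_wider_gap = total_drop - from_productivity   # 0.4 points

print(round(from_productivity / total_drop, 3))   # 0.667: two-thirds from productivity
```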

My conclusion: The sharp fall in productivity growth is the major reason why Americans feel so squeezed. Growth policies are key.

 

Governor Markell for The Atlantic: Americans Need Jobs, Not Populism

In an op-ed for The Atlantic, Governor Jack Markell (D-Del.) argues that instead of raging against a “rigged” system, Democrats should work together with business to build an economy that distributes its benefits more broadly.

The bottom line is that private enterprise creates the primary condition for reducing poverty and want: economic growth. Governments don’t create jobs; however, government has an ability and responsibility to create a nurturing environment where business leaders and entrepreneurs want to locate and expand. What that means is that government has an active role in creating an economic environment that creates middle class success and prosperity. …

Long-term success requires an active government that partners with business to ensure that the bounty of economic growth is shared broadly. Sharing this bounty is not about having a “bleeding heart.” It’s a matter of cold economic sense.  

I am hugely bullish about the future of the American economy because I believe in investing in people, engaging with the world and sharing broadly the bounty that economic growth will generate. Growing without sharing won’t get it done.  And neither will redistribution without growth. Americans really are in this together.

Read the piece in its entirety at The Atlantic.

Huffington Post: Science, Not Politics, Should Drive Trade and Regulatory Decisions

The Obama administration issued a stinging rebuke of the European Union’s decision this week to allow European countries to “opt out” of imports of U.S. genetically modified (GM) foods and feed. The U.S. Trade Representative said that such a rule “ignore[s] science-based safety and environmental determinations” that modifying crops in laboratories is no more harmful than traditional cross-breeding of crops in the field. Yet in today’s hyper-politicized culture, the regulatory process in the United States is also often hijacked by special interest groups that subvert science in favor of their own emotional “narratives,” which can be deeply misleading.

Modern advances in food science, both in how we produce and deliver food, have become key battlegrounds in the debate between science and fear-mongering. On the production side, GM foods offer a much-needed path to feeding the world’s population. In the United States, the Food and Drug Administration (FDA) and the Environmental Protection Agency (EPA) have carefully studied GM foods and found them safe. The lack of any scientifically valid concerns, though, has not stopped special interest groups from seeking federal and state laws requiring that GM foods be labeled. And for all the federal government’s willingness to cast aspersions abroad, the USDA has held up approval of modified salmon despite clear science that such fish are safe.

The politicization of the federal regulatory process takes on a whole new dimension, though, when one federal agency funds special interest studies that undermine another agency’s scientific conclusions underpinning federal regulations. This has been happening with bisphenol-A (BPA), which has been used since the 1960s to coat metal food cans, preventing the growth of germs that can harm consumers. It has long been well understood that BPA molecules can migrate from the packaging into the food, and the FDA regulates BPA as an indirect food additive.

Here, the global community is united. The FDA, along with the European Food Safety Authority, Health Canada, and the World Health Organization, has studied BPA extensively and found its use in food containers to be safe. These groups have grounded their decisions in science. In short, they have found that humans rapidly metabolize BPA and that any BPA ingested is excreted in urine. Since 2000, though, the National Institutes of Health (NIH) has funded $172 million in research on BPA. Many grants have gone to scientists supported by the same groups that oppose GM foods regardless of science — Greenpeace, the Natural Resources Defense Council, and others. Not surprisingly, these scientists produce studies critical of BPA.

In response to alarmist reports, a subcommittee of the FDA’s science board recommended in 2008 that the agency re-examine the scientific basis for approving BPA. Last year, the FDA completed a four-year review of more than 300 scientific studies and once again found no evidence that BPA is harmful to humans when used in food containers and packaging. The broader scientific community found the studies critical of BPA to be fundamentally flawed. At this point, NIH must stop funding scientifically questionable studies, or it will risk harming the American government’s credibility as a steward of important scientific issues.

The tactic of trying to influence regulations by undermining science is not unique to food science or to any political party or cause. Several years ago, reproductive rights groups rightly cried foul when the FDA, under pressure from conservative activists, held up over-the-counter sales of the Plan B pill despite science proving the drug’s safety and effectiveness. And we saw what happened with the measles outbreak last year, when libertarians across the political spectrum refused to follow science-based regulations requiring that children be immunized against certain diseases, including measles.

Progressives who believe in a strong regulatory regime should follow the U.S. Trade Representative’s lead and oppose the use of junk science to undermine the credibility of federal regulations. Since Vice President Gore’s Reinventing Government efforts in the 1990s, progressives have staked out the pragmatic position in the debate over the appropriate level of government regulation. Federal agencies should get smart on an issue, develop targeted regulations, and effectively facilitate commerce while assuring appropriate protections.

As technological advances continually push against our political and moral boundaries and regulatory agencies grow their footprints, it becomes increasingly important that science, not politics, drives regulatory decisions. Especially when it comes to life’s basic needs, such as making food more plentiful and less expensive, if scientific facts are undermined for political expediency, the most vulnerable people among us will lose.

This piece was cross-posted on The Huffington Post.

Copyright in the Digital Age: Key Economic Issues

The bounds of traditional copyright are being stretched and broken by technological change. The ease of digital copying, combined with new forms of creativity and production, including 3D printing, is transforming the copyright landscape at an accelerated pace.

Creators, companies, and governments need to think clearly about which goals of copyright are most important to them, and move toward a system that supports those goals. Speaking in the broadest terms, copyright establishes the right of an author or creator to control and benefit from his or her artistic endeavor. Yet what is society trying to achieve by granting such a right?

There is no better time to consider this fundamental question. The European Commission, under President Jean-Claude Juncker, has put a high priority on creating a Digital Single Market, which among other things would replace national copyright systems with a single EU system. Meanwhile, over the next several months, the European Parliament will be considering a draft report that offers up its own version of an EU-wide copyright system.

Simultaneously, American and European T-TIP negotiators are talking about how to harmonize intellectual property protection across the Atlantic, which could affect copyright as well. And national governments in Germany and Spain extended their copyright systems in recent years for the explicit—and ultimately unsuccessful—purpose of charging Google News and other sites a fee for running snippets of stories from national newspapers.

Download “2015.04-Mandel_Copyright-in-the-Digital-Age_Key-Economic-Issues.pdf”

Rotherham: No Congressional District Left Behind

In an op-ed today for U.S. News & World Report, Andrew Rotherham, cofounder and partner at Bellwether Education Partners, intriguingly argues that the best school reform idea is to fix the gerrymandering of legislative districts:

One of the interesting things about my job is that wealthy people ask me for ideas about how best to use their resources to improve America’s schools. There are plenty of important issues demanding attention: overhauling the sorry state of teacher preparation and teacher policy (I wrote an entire guidebook about that), giving low-income Americans more educational choice and improving educational finance are three obvious ones. But, to the consternation of colleagues in the education world, I don’t first suggest those or other specific education issues. Instead, I urge donors to support efforts to reform congressional redistricting. We won’t be able to genuinely improve our schools (or address a host of other issues) until we create legislative districts based on geography rather than gerrymandering.

Read the op-ed in its entirety at U.S. News & World Report. 

Press Release: PPI Statement On FCC Open Internet Order Release

PPI Statement On FCC Open Internet Order Release

Time for Congress to Act

WASHINGTON—Dr. Michael Mandel, Chief Economic Strategist of the Progressive Policy Institute (PPI), today released the following statement after the FCC published the Open Internet order:

“Today, the FCC released the 400-page text of its Open Internet order. From the economic perspective, it’s distressing that the Commission has decided to impose this many new regulations on a technologically dynamic and innovative sector that has been propelling growth. From the political perspective, it’s equally distressing that Americans are only seeing this order after the Commission approved it, showing a lack of transparency. And from the common sense perspective, the FCC’s promise to forbear from rate regulation is total nonsense, given all the other rules the Commission has pledged to enforce in its order.

“We all believe that having an open internet is important, but the FCC has picked the wrong approach. We urge Congress to pass a set of open internet rules that don’t take us back in time.”

###

Read PPI’s previous work on this issue:

The Truth Behind The FCC’s “Fact Sheet” by Hal Singer
The Best Path Forward on Net Neutrality by Hal Singer and Bob Litan
Outdated Regulations Will Make Consumers Pay More for Broadband by Hal Singer and Bob Litan
One last chance to save the Internet by Ev Ehrlich
The Wrong Way to Enact The Wrong Policy — The FCC’s No Good, Very Bad Day by Ev Ehrlich