Has The FCC Made Its Case To Restrict Certain Bidders In The Broadcast-Spectrum Incentive Auction?

FCC Chairman Tom Wheeler recently signaled that his agency is considering certain bidding restrictions for the upcoming broadcast-spectrum auction that are specifically targeted at the two largest nationwide providers. At some ill-defined point in the auction, the restrictions reportedly would be imposed on any bidder that holds more than one-third of the available “low-band” spectrum in a market.

And guess who holds more than one-third of “low-band” spectrum in any particular market? AT&T and Verizon. As a result of the proposed restrictions, between 40 and 50 percent of the spectrum blocks in a given band plan would be off limits to the two mobile broadband companies best positioned to battle cable modem providers. Is this a good thing?

The best policy justification for bidding restrictions in an auction is the presence of monopoly power. The theory is that a monopolist is willing to pay more to cement its position than a rival is willing to pay to displace the monopolist. Although the auction might cause the monopolist to surrender a good portion of its profits to the auctioneer, at the end of the day, consumers are still beholden to monopoly prices. (The second best justification for a restriction is that there is something special about “low-band” spectrum—without it, smaller carriers cannot compete effectively. I have rebutted this justification here.)

A review of the evidence suggests that no wireless carrier is exercising monopoly power—that is, setting prices above competitive levels or restricting output.

Recent price cuts in response to T-Mobile’s “Uncarrier” initiatives and no-contract plans have put downward pressure on wireless margins. In February, AT&T cut its Mobile Share shared-data plan prices (with 10 GB of data) to $160 per month for four phone lines; in response, Verizon matched that pricing in April. In March, AT&T also cut the price of its smaller shared-data plan (with 2 GB of data) by $15, to $65 per month for one phone line. These pricing episodes are hardly consistent with the notion of monopoly power.

Perhaps these recent price cuts mask a longer trend of rising prices? Not so. According to the FCC’s 2013 Wireless Competition report, competition is robust:

  • Monthly average revenue per unit (“ARPU”) for wireless service declined from $48.04 in 2006 to $46.63 in 2012; wireless voice revenue per minute has declined from $0.06 to $0.05 over the same period.
  • Voice revenue per minute in the United States ($0.033) is less than one third of the European average.
  • U.S. mobile subscribers talked an average of 945 minutes per month on their mobile phones in 2011, compared with 134 minutes in Japan and 170 minutes in Western Europe.
  • And the United States has the second least concentrated market structure in a Bank of America survey of ten countries, behind only the United Kingdom.
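The ARPU figures in the first bullet imply only a slight nominal decline, which is easy to verify with a few lines of arithmetic. The sketch below uses only the numbers quoted above; it is an illustration, not part of the FCC report:

```python
# ARPU figures quoted from the FCC's 2013 Wireless Competition report
arpu_2006 = 48.04
arpu_2012 = 46.63
years = 2012 - 2006

# Cumulative change over the six years (~ -2.9%)
total_change = arpu_2012 / arpu_2006 - 1

# Implied annual rate of change (~ -0.5% per year)
annual_rate = (arpu_2012 / arpu_2006) ** (1 / years) - 1

print(f"total: {total_change:.1%}, annual: {annual_rate:.2%}")
```

A roughly half-percent annual drift in nominal revenue per unit is modest, but it points in the opposite direction from what sustained monopoly pricing would produce.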

After 408 pages of excruciating detail on the state of wireless competition, the FCC is hard-pressed to identify any data consistent with monopoly power. And without a showing of monopoly power, the social benefits of these bidder restrictions are likely insignificant.

On the other hand, unwarranted restrictions can inflict significant losses on society in three important ways.

First, assuming AT&T even shows up to the auction, prices on the restricted blocks will be significantly less than the prices in the non-restricted blocks. Although there is some chance that prices in the non-restricted blocks could be higher (due to the artificial scarcity created by the restrictions), the FCC is exposing itself and the taxpayer to a considerable risk of diminished auction revenues—revenues needed to fund deficit reduction, build-out of an interoperable public-safety network, and other priorities enumerated in The Middle Class Tax Relief and Job Creation Act of 2012. And auction revenues are needed to compensate broadcasters interested in giving up their spectrum. The amount of the broadcast spectrum that will be available for reallocation to wireless broadband will depend critically on the broadcasters’ perception of auction prices; the law of supply dictates that there will be less spectrum available for sale the lower the expected price.

Second, by setting aside valuable spectrum, the FCC is creating an attractive opportunity for firms to engage in regulatory arbitrage. Set-asides will encourage firms interested not in building networks but in buying spectrum to flip later for a windfall. History has shown that set-aside spectrum sits fallow for years, staving off the sort of broadband deployment that Congress desires. Competition for the arbitrage opportunity leads to wasteful “rent seeking” activity, which represents another loss. Allowing the carrier that will ultimately deploy the spectrum to purchase it immediately and directly (rather than through a wasteful and superfluous middleman) is clearly the more efficient choice.

Third, efficiency dictates that spectrum goes to the wireless carriers that value it the most. If an incumbent carrier facing a spectrum crunch is willing to pay more for the next chunk of available spectrum than an entrant, assigning the spectrum to the entrant represents a misallocation of society’s resources. Relatedly, a significant challenge facing the FCC is injecting competition into the broadband marketplace. According to the FCC’s most recent data, a full 19 percent of U.S. homes were beholden to a single provider of broadband service (including wireless operators) capable of delivering download speeds of 10 Mbps. Wireless broadband could impose significant discipline on cable operators in these pockets if the FCC opens up the spectrum spigot to all firms, as opposed to parceling out thin slices to smaller companies.

In sum, the FCC has failed to meet its evidentiary burden for the use of bidding restrictions as currently proposed for the upcoming incentive auctions. There is no compelling evidence of monopoly power in the wireless sector. And there has been no attempt to prove that smaller carriers need access to “low-band” spectrum to compete effectively against their larger competitors. Until those burdens are met, the FCC should let the auction blocks fall where they may.

This article was originally published in Forbes; the original article can be found on their website here.

FCC’s Wheeler Plays Hand Courts Dealt Him

FCC Chairman Tom Wheeler’s determination that he can allow Internet Service Providers to offer differentiated service options to websites and content providers – an ability that “net neutrality” advocates regard as decidedly non-neutral – surprised many people.  But perhaps it shouldn’t have.

Wheeler’s announcement resolved a mystery created by a recent court decision that the FCC lacked the power to regulate the way broadband providers manage their networks.  Specifically, in a case brought by Verizon, the Court denied Wheeler and the FCC authority to specify that there must be only one tier of service on the Internet, the essence of the neutrality program.  But the Court also recognized his authority to regulate broadband as part of the FCC’s larger obligation to promote the Internet.

Predicting that it was time for Wheeler to lead the FCC past the neutrality debate and modernize the regulation of the Internet was not necessarily an act of clairvoyance – it was simply the product of a level-headed reading of the situation.  I participated in a Progressive Policy Institute forum last month in which a variety of experts, including some advocates of net neutrality, came to a surprising degree of consensus about Chairman Wheeler’s response to the D.C. Circuit’s decision. Basically, we thought he had three options for regulating the Internet, and two of them weren’t going to work.

The first, and most radical, would be to declare that the Internet was really “just a telephone network” and therefore subject to the most intrusive regulations the FCC can muster.  That would have been a radical step from several perspectives.  First, and most obviously, saying that the Internet is really “just like” the Ma Bell phone system is like saying a Maserati is “just like” a Model T and should be subject to the same speed limits. But it should also be recalled (particularly by those who think the Internet should be a state-owned “public utility”) that the FCC’s regulation of phones was premised on a sanctioned monopoly in which companies invested without significant risk.  In contrast, the modern Internet was built by over a trillion at-risk, private dollars pouring into competing technological platforms.  On these and a variety of other bases, “reclassifying” Internet as telephony would have a very hard time passing the laugh test in court.

(Nor, in fact, might that resolve the problem – read the original Communications Act of 1934 and you’ll be surprised to see that it’s quite comfortable with differentiated services, so long as they’re made available to all.  Which is, of course, exactly Wheeler’s position eighty years later.)

A second option was to go to the Congress for explicit legislative authority to regulate conduct on the Internet.   I’m not a professional political analyst but…good luck with that.

To be fair, there may be an emerging middle ground in the Congress for an Internet policy perspective that might not be far from where Wheeler is today; in the past few weeks, for example, over 70 House Democrats signed a letter calling for open and unrestricted spectrum auctions, a sharp departure from the view held by some of their colleagues that the winners of those auctions should be prejudged by the FCC.  That’s a vote of confidence in competition.   But some in Congress advocate not just for net neutrality, but for extended public ownership and control of the Internet, while many on the other side doubt that we need any regulatory protections whatsoever, let alone an effort to extend the Internet’s role in such areas as health and education, or addressing the “digital divide.”  So there’s no obvious consensus on any issues of Internet regulation, let alone imposing neutrality through regulation.

Which leaves Wheeler with a third option – to play the hand the Court dealt him.  And he appears to be doing so smartly, by allowing ISPs to offer websites and content providers (often called “edge providers”) prioritization for those services that want it (perhaps high-definition video conferencing or real-time, interactive services such as health, teaching, or gaming and entertainment) while letting the rest of Internet traffic – your e-mail sharing a video of a cat playing the xylophone – move as it always has, unabated.  He also made it clear that allowing some content to move on “express lane” terms is not the same as blocking other content, and that he would reserve the right to make sure that any prioritization deals were “commercially reasonable.”  Hopefully, this will mean a case-by-case review of actual transactions that have inflicted actual harm on an actual someone, not making judgments that reflect nothing more than the sensibilities of bureaucrats.  In fact, the PPI panel was also in broad agreement on this point – that it was time to embrace a new regulatory perspective that allowed “experimentation” in the way service is provided and that adjudicated contentious issues after the fact and after demonstrated harm has occurred, rather than through blanket, a priori, regulatory pre-emptions.

Wheeler seems to have embraced this approach. He’s getting us off “Square Zero” by recognizing that tiered service has its place, and putting to rest the neutrality debate that my colleague Hal Singer said last month “is sucking all the oxygen out of the room.”  In that sense, Wheeler’s most important accomplishment in announcing his view might be to make clear that opponents have mischaracterized “prioritization” as “blocking competing content,” as the end of “permissionless innovation,” as a threat to the “open Internet,” and other such slogans.

These catchphrases are commonly accepted by many media outlets, but now have been put to shame, and hopefully, rest.  Prioritization doesn’t change the reality that everyone who wants to bring content to the Internet can do so without impediment; in fact, the ISPs desperately want them to do so, since that’s the value proposition of what they’re selling.  Making that clear only ratifies what the market has already decided.  Nor does it mean that the ISPs will decide who can innovate and who can’t any more than the post office decides who can send a letter and who can’t when it offers First Class Mail and then Priority Express.  Wheeler has, to his credit, made clear what the real issues are.

And he appears to be disregarding the complaint that prioritization would be unfair to “the little guy.”  If that were the standard, every sector of the economy would come under regulation.  The little guy has to pony up to put his product on supermarket shelves or to buy a $5 million Super Bowl spot.  The Internet will remain a more competitive sector than virtually any other in the economy.  In fact, the Internet is already tilted against the small, start-up website; Big Websites already have speed advantages over the “little guy” due to pervasive caching of content.  Prioritization may make it easier for the “little guy” to catch up.

Let me make two predictions.  First, “prioritization” will change the Internet less than many think.  Network speeds in the US are increasing rapidly, and we have gone from 22nd in the world to 8th in a very short time (once the courts removed regulatory impediments to sustained investment).  And, we are one of the few nations on Earth that have competing platforms bringing broadband to the consumer – phone companies, cable companies, wireless (where we lead the world), and satellite, as opposed to the nations that staked their bets on a national phone company and are coming to regret it.  So our prospects for leadership are excellent.  I’m not sure how many sites will jump at the chance to improve their stream given how good the system as a whole is becoming.

And the second prediction is that Wheeler has now broken the ice and will lead the FCC into a series of decisions in which a “sensible center” finally holds sway. This would include accelerated auctions of spectrum now held by the government and broadcasters, open auctions for new spectrum, allowing the market for “peering” and other backbone transactions to evolve as any other competitive market would, and – one hopes – a revitalized National Broadband Plan to realize the Internet’s social potential.  In all of these cases, the FCC Chairman can reproduce the successful strategy he employed to move the “neutrality” debate forward – seizing the only realistic option in front of him and running with it.

Everett Ehrlich is the president of ESC Company, and a senior fellow at the Progressive Policy Institute.

Bridging The Data Gap: How Digital Innovation Can Drive Growth and Jobs

Seldom has the world stood poised before economic changes destined to bring as much palpable improvement to people’s lives and desirable social transformation as “big data.”

Breathless accounts abound of the huge amounts of data that citizens, consumers and governments now generate on a daily basis in studies ranging from the French Prime Minister’s Commissariat général à la stratégie et à la prospective study on Analyse des big data: Quels usages, quels défis to Viktor Mayer-Schönberger and Kenneth Cukier’s seminal Big Data: A Revolution That Will Transform How We Live, Work and Think.

But the larger revolution will come not from the exabytes of data being generated on a daily basis, but through the vast advances in analytics that will help us convert this information into better lives, and better societies. Already, many companies are using the new information to offer more tailored products and services to customers; consumers are receiving more effective healthcare; clever administrations are cutting pollution and commuter transit times; people of all types are being entertained and educated in fascinating new ways; and entrepreneurs who seize the opportunity are helping raise North America and Europe from the longest economic recession since statistic-taking began.

Download the full report here.

Michael Mandel and Paul Hofheinz presented their paper today at the PPI & Lisbon Council joint event: New Engines of Growth: Driving Innovation and Trade in Data

Data, Trade and Growth

We show in this paper that the architecture of the Internet dictates that current trade statistics significantly underestimate the magnitude and growth of cross-border data flows. As a result, the contributions of cross-border data flows to global growth and to small businesses are being significantly underestimated. This suggests that trade and tax policy should place more emphasis on maintaining cross-border data flows. Moreover, policies that discourage cross-border data flows, such as data localization and high tax rates on cross-border data, should be avoided if possible. Statistical agencies should explore adding data as a separate trade category, along with goods and services.

INTRODUCTION
The architecture of the Internet is designed as a “network of networks.” As such, one of its key attributes is making the passage of data from one network to another easy. So, when a user sends an email, views a video, or downloads a file from a website, the data may pass through a large number of different networks on the way from its origin to its destination, with the routing virtually transparent to the user.

This architecture has proven to be extremely flexible and powerful, both nationally and globally. Individuals, small businesses, and corporations with Internet access can easily access data of all sorts from around the world. Similarly, companies can efficiently and cheaply provide services such as email and web search on a global basis, in many cases without charge.

One sign of the Internet’s global success is the rapid growth of cross-border data flows. Cross-border data flows are growing far faster than conventionally measured trade in goods and services. According to TeleGeography, a consulting firm that keeps track of international data flows, demand for international bandwidth increased at a compound annual rate of 49% between 2008 and 2012. By comparison, the overall volume of global trade in goods and services, adjusted for inflation, rose at an average rate of 2.4% over the same period.
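To see how stark that comparison is, the two compound annual growth rates quoted above can be unwound into cumulative growth over the four-year 2008–2012 window. This is a quick illustrative sketch using only the figures cited:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# A 49% CAGR over 2008-2012 (four compounding years) implies demand
# for international bandwidth grew by a factor of 1.49**4, roughly 4.9x.
bandwidth_multiple = (1 + 0.49) ** 4   # ~4.93

# Trade growing 2.4% a year compounds to only ~10% over the same window.
trade_multiple = (1 + 0.024) ** 4      # ~1.10

print(f"bandwidth: {bandwidth_multiple:.2f}x, trade: {trade_multiple:.2f}x")
```

In other words, over the same four years in which real trade volume grew about a tenth, cross-border bandwidth demand roughly quintupled.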

Continue reading and download the full report.

Mandel Speaks at All Things Connected Washington Post event

Michael Mandel, chief economic strategist at the Progressive Policy Institute, described the Internet of Things as the “extension of the Internet to the physical world.” He told the audience at Washington Post Live’s All Things Connected forum, “The Internet has transformed digital industries, while the Internet of Things will transform physical industries.”

Thanks To Bill Clinton, We Don’t Regulate The Internet Like A Public Utility

A DC federal court struck down the FCC’s “net neutrality” regulations earlier this year, but did nothing to resolve an ongoing debate over whether or how the government should regulate the Internet.  At the heart of the controversy lies a central question – should we regulate the Internet as we did the old telephone network and other so-called “common carriers”?

In a paper to be released this week by the Progressive Policy Institute, I examine the past two decades’ experience to shed light on this question.  And the answer that keeps coming up is that proposals for strict utility-style regulation of the Internet have two things in common.  First, they are based on the presence of a “natural monopoly” for broadband that simply does not exist.  And second, where they have been tried, utility-style rules have been the greatest single obstacle to investment in broadband infrastructure.

From the earliest days of the Bell monopoly, our telephone system was built around an explicit bargain.  In exchange for a guaranteed and low-risk profit, the Bell system would provide quality, reliable phone service to the nation.  This bargain was deemed necessary because it was assumed that phone service was a “natural monopoly” where the costs of infrastructure were so high that competition wasn’t possible.  But by the 1990s, those assumptions had completely broken down.  Microwaves and coaxial cable could carry phone calls, phone lines could deliver video, and an “information superhighway” loomed in the future.

The Clinton Administration’s Telecommunications Act of 1996 sorted this mess out and launched the age of modern Internet policy – trusting market forces and technological innovation to the maximum extent.  It was an act of incredible political maturity.  Its authors knew something remarkable was about to happen and that government could best serve it by stepping back and letting private investment happen.

So the 1996 Act drew a line – the old phone system would remain regulated as a “common carrier,” but the emerging new world of “information services” would be allowed to develop on its own free from utility-style requirements such as government oversight of prices, forced sharing of infrastructure with competitors, or rigid traffic management rules.  As a result, we have seen over $1.2 trillion in investment since the 1996 Act, and the innovation, growth and new services the Act’s framers imagined.

Further light is shed by the treatment of the incumbent phone companies.  As a transitional measure, the Act preserved the utility model for the telcos, which were forced to share any infrastructure they built with all comers at a government-supervised price (well below its long-term cost).  That requirement smothered investment, since no one would build new infrastructure if they had to share it with competitors at a loss.  The result was initial stagnation in DSL broadband.  And when that requirement was later overturned, investment followed there as well – more evidence of the dangers of the utility model in this space.

Europe still relies on these utility-style regulations and has used its state post and phone monopolies to build out broadband.  The results haven’t been pretty.  Per capita investment in broadband in the U.S. is nearly double that of Europe.

As a result, our major European trading partners are anchored near the bottom of the Internet speed charts – Germany is 27th in the world on the most recent Akamai speed rankings, France is 34th, Italy 48th.  The US, by contrast, is 8th, trailing only small, dense, highly urbanized places like Japan, South Korea, and Hong Kong – a striking showing given the U.S.’s sprawling geography.  No wonder EU Digital Policy chief Neelie Kroes says Europe “needs to catch up” in broadband.

The “natural monopoly” that pro-regulation arguments depend on clearly does not exist.  America now has three different broadband technologies fully deployed and competing for customers (cable, telco, and 4G wireless).  The U.S. is near the top of global rankings in both high-end service, with 85 percent of households served by networks capable of 100 Mbps or more, and the most affordable entry-level wired broadband of any nation in the OECD.  Imagine what would ensue if we were to change course and regulate the Internet as a monopoly utility.  Which of the three technologies would regulators adopt?  How would we ensure continued investment?

The Internet is undeniably incredibly important.  But that importance doesn’t mean that we should treat it as a public utility.  Bringing back the days of Ma Bell won’t fulfill broadband’s remarkable promise.

This article was originally posted by Forbes.  You can read the original post on their website here.

A Brief History of Internet Regulation

EXECUTIVE SUMMARY
Proposals to regulate the Internet are often presented as “new” solutions to deal with modern problems, but the most significant of these proposals, such as “network neutrality” and common carrier rules on unbundling and interconnection, are actually vestiges of long-outmoded ways of thinking about telecommunications policy. This paper explores the relevant regulatory history, offering critical context to today’s Internet policy debates.

From the early days of the AT&T monopoly well into the 1990s, regulators, the courts and the Congress engaged in a lengthy effort to protect consumers and ultimately bring competition into the markets for local and long-distance telephone service. This included strict “common carrier” utility regulations and mandatory interconnection requirements and ultimately the 1984 Modified Final Judgment, which forced the breakup of AT&T into regional Baby Bells. From the beginning of “community antenna TV” through the 1990s, a parallel but more limited effort was made to regulate the nascent cable industry. While these regulations had some success, technological change quickly outstripped them—both in the telephone business and the emerging field of high-speed data—and a bipartisan consensus formed in the early 1990s that additional steps were needed to promote competition in all these arenas.

The result was the Telecommunications Act of 1996, watershed legislation that marked the end of the telephone age and the beginning of the Internet age from a policy perspective. The Act embraced and codified the FCC’s distinction between traditional telephony/telecommunications services and the emerging world of information services, with strict common carrier rules limited to the former. On the telephone side, this meant a stifling regime of mandatory “unbundling” and rigid price controls, while giving the private sector more latitude to innovate and invest on the “information services” side. The 1996 Act may not have specifically contemplated the rise of the broadband Internet (the idea of an “information superhighway” was in the air, but the exact form it would take was still unclear as a matter of both technology and policy), but by protecting information services from the common carrier framework, the Act set the stage for the dynamic growth we have seen in American broadband.

The result was a boom in cable broadband investment that telecommunications providers attempted to counter by offering DSL services. But any new DSL capability they constructed had to be leased out to competitors at below-market prices under the unbundling regime, which limited their efforts. When fiber and DSL were relieved of their unbundling obligation in the early 2000s, however, capital poured in and these services flourished as fixed-broadband competitors to cable. In fact, that competition drew a competitive response from cable, in turn leading to a virtuous cycle of improvement and enhancement resulting in the United States ascending to the upper reaches of the international broadband rankings.

This background sheds important light on current calls to impose “new” regulations on broadband either through “network neutrality” rules or by reclassifying it as a “telecommunications service” subject to common carrier obligations. While advocates suggest otherwise, these proposals are clearly not new, but would represent a return to the dated—and in the view of this paper failed—approach that the bipartisan 1996 Act was designed to sweep away. Most of these proposals for network micromanagement, forced sharing of investments, and government influence on pricing have been associated with low investment and innovation. These rules may have made sense when the problem was how to protect consumers in the days of the sanctioned Ma Bell monopoly, but the business and consumer landscape is dramatically different today in almost every regard.

Ultimately, three key lessons emerge from this policy review. First, information services and telecommunications services really are different, and broadband has flourished as an information service free from ill-fitting and stifling common carrier constraints. Second, investment and capital flow to where regulation (or the absence thereof) encourages them to flow. And third, technology, business models, and consumer behaviors change and, as they change, the meaning and effect of different regulatory proposals change as well.

Download the entire report.

The Hill: Been there, done that on broadband

A DC federal court struck down the FCC’s “net neutrality” regulations earlier this year, but did nothing to resolve an ongoing debate over whether or how the government should regulate the Internet.  At the heart of the controversy lies a central question – should we regulate the Internet as we did the old telephone network and other so-called “common carriers”?

In a paper to be released this week by the Progressive Policy Institute, I examine the past two decades’ experience to shed light on this question.  And the answer that keeps coming up is that proposals for strict utility-style regulation of the Internet have two things in common.  First, they are based on the presence of a “natural monopoly” for broadband that simply does not exist.  And second, where they have been tried, utility-style rules have been the greatest single obstacle to investment in broadband infrastructure.

From the earliest days of the Bell monopoly, our telephone system was built around an explicit bargain.  In exchange for a guaranteed and low-risk profit, the Bell system would provide quality, reliable phone service to the nation.  This bargain was deemed necessary because it was assumed that phone service was a “natural monopoly” where the costs of infrastructure were so high that competition wasn’t possible.  But by the 1990s, those assumptions had completely broken down.  Microwaves and coaxial cable could carry phone calls, phone lines could deliver video, and an “information superhighway” loomed in the future.

The Clinton administration’s Telecommunications Act of 1996 sorted this mess out and launched the age of modern Internet policy – trusting market forces and technological innovation to the maximum extent.  It was an act of incredible political maturity.  Its authors knew something remarkable was about to happen and that government could best serve it by stepping back and letting private investment happen.

Continue reading at the Hill.

A Merger of Necessity

The proposed merger between Comcast and Time Warner highlights the vast gap between the imagined world of the broadband industry’s critics and the real world in which these companies must compete.

For years, the critics have advocated forcing companies such as Verizon and Comcast to share their infrastructure with their competitors or mandating that the broadband market only offer one level of service. Their argument is that America’s broadband is gripped by a “cable/telco duopoly” that uses its market power to slow innovation and gouge the consumer. And the Comcast-Time Warner combination is their new monster under the bed.

In fact, the substance of these criticisms is simply wrong. The latest rankings from Akamai show the U.S. eighth and rising in global Internet connection speeds, and a new report from the International Telecommunication Union depicts U.S. wireline broadband as the most affordable among our trading partners as well.

But even more dissonant are the data on profitability. In a new study set to be released next week by the Progressive Policy Institute, I examine the rates of profit of two subgroups of the Fortune 500 — companies that provide the Internet (from AT&T and Verizon down to Level 3 and Frontier) versus companies that reside on it (from Apple and Microsoft to Facebook and Yahoo). The (weighted average) rate of profit on sales for the “providers” is 3.7 percent, versus 24.4 percent for the “residers.” Calculated on assets, the rates are 2.1 percent and 17.7 percent, respectively.
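For clarity, a sales-weighted rate of profit is simply total profit divided by total sales across each group, so large firms count for more than small ones. The sketch below uses hypothetical company names and dollar figures — they are invented for illustration and chosen only so the group margins come out to the 3.7 and 24.4 percent reported above:

```python
# (name, sales $B, profit $B) -- illustrative numbers, not the study's data
providers = [
    ("ProviderCo A", 120.0, 4.8),
    ("ProviderCo B", 80.0, 2.6),
]
residers = [
    ("ResiderCo A", 150.0, 36.6),
    ("ResiderCo B", 50.0, 12.2),
]

def weighted_margin(firms):
    """Sales-weighted profit margin: total profit / total sales."""
    total_sales = sum(sales for _, sales, _ in firms)
    total_profit = sum(profit for _, _, profit in firms)
    return total_profit / total_sales

print(f"providers: {weighted_margin(providers):.1%}")  # 3.7%
print(f"residers:  {weighted_margin(residers):.1%}")   # 24.4%
```

Weighting by sales rather than averaging each firm's margin matters: a simple unweighted mean would let a tiny, unusually profitable firm distort the group figure.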

So the companies that use the broadband Internet are making six to eight times the margins of the allegedly monopolistic companies that provide it — the exact opposite of what you’d see if the price-gouging accusation were real.
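The arithmetic behind the “six to eight times” claim is easy to check. A minimal sketch follows; the two pairs of aggregate margins are the figures reported above, while the firm-level numbers are purely hypothetical, included only to show how a sales-weighted margin is constructed:

```python
# The aggregate margins (3.7% vs. 24.4% on sales; 2.1% vs. 17.7% on assets)
# are the figures reported in the study; the firm-level data below are
# hypothetical, for illustration only.

def weighted_margin(profits, sales):
    """Aggregate margin: total profit divided by total sales, in percent."""
    return 100 * sum(profits) / sum(sales)

# Hypothetical (profit, sales) pairs in $B for two "provider" firms.
providers = [(5.0, 130.0), (4.5, 120.0)]
margin = weighted_margin([p for p, s in providers],
                         [s for p, s in providers])
print(f"hypothetical providers' weighted margin: {margin:.1f}%")  # 3.8%

# The "six to eight times" claim is just the ratio of the reported aggregates.
print(f"sales-margin ratio: {24.4 / 3.7:.1f}x")   # ~6.6x
print(f"asset-margin ratio: {17.7 / 2.1:.1f}x")   # ~8.4x
```

Note that a weighted margin divides total profit by total sales, so large firms dominate the aggregate; a simple (unweighted) average of each firm’s margin would give a different figure.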

The problem is that advocates for regulation simply don’t get the competitive dynamics of the broadband industry. And without that understanding, we can’t make sense of the Comcast/Time Warner merger.

In the rotary phone world, “connectivity” — dial tone — created all the system’s value, once you had a phone. But the Internet is different. Rather than a “dumb” signal, Internet connectivity is part of a multi-part parlay with devices, services, applications and other components that deliver value to the consumer. All of these components compete for a larger slice of the integrated customer value pie.

Consider the iPhone. Its vaunted voice recognition technology, for example, has been around for a long time. It’s only now being offered in phones because mobile broadband is powerful enough to let the cloud deliver the service to the user in real time.

So the innovation that makes the iPhone and its applications more valuable to consumers was really the faster speeds offered by mobile service providers. And this is the competitive reality today. The device, website, app and content companies are capturing most of the benefits created by the connectivity “providers,” hence their lusher margins. Yet the providers must continually innovate and improve their service so their customers will bring those devices and applications to the providers’ platforms. In essence, the “providers” are caught in a loop in which they innovate, the downstream device and service companies capture the value created by those innovations, and the providers must then innovate all over again. No wonder the residers’ margins far outstrip the providers’.

And it’s not just the mobile market. Watch bandwidth-munching UltraHD TV — so-called “4K” — as it enters the consumer market, now that there’s enough bandwidth to support it. Will the set-makers make the margin, or the broadband providers who made the new sets possible?

So, contrary to their caricature as duopolists, the providers’ market power is extremely limited. They are essentially high fixed-cost systems that must continually attract new customers to spread their fixed costs over a larger base, even as other companies garner most of the benefits of their innovation.

And they have little power over content as well. If Comcast were to block, say, YouTube, would you keep its service, or switch to Verizon, AT&T, Sprint, Dish, DirecTV or any of several other providers to get what you want to see? And which is the danger — that Comcast will charge you to reach YouTube, or that YouTube will one day charge Comcast to be on its system? In the real world, content, not connectivity, has the muscle.

And this is the backdrop against which we should see the Comcast/Time Warner merger. Comcast’s offerings will immediately improve the service Time Warner’s customers receive. And that will make the combined company a better competitor and innovator in the competitive cage match in which the providers of connectivity, devices, apps, services, and content fight for a share of the value the broadband world creates. Rather than a denial of competition, the proposed merger demonstrates that active, aggressive competition is underway in broadband, and Comcast is girding itself for that contest. The right policy is to let them do so.

This article was originally posted in The Baltimore Sun, read it on their website here.


Private Sector Investment and Innovation Must Be Encouraged

PPI has consistently argued that in most cases, pro-investment and pro-innovation policies will give the best long-term results for raising the living standard of ordinary Americans, boosting domestic production, creating jobs and raising real wages. Today, private nonresidential investment in equipment and structures as a share of GDP is still significantly below pre-recession levels. As a result, we believe that government policymakers should pay close attention to promoting investment, innovation, and productivity.

These principles should influence the government approach to mergers such as the proposed combination of Comcast and Time Warner Cable. Antitrust authorities should assess the merger on the basis of how it might affect investment, innovation, and productivity, as well as on competition grounds. In addition, PPI research, based on official government statistics, shows that investment in the tech/info sector is creating job opportunities for blacks and Hispanics. These benefits should be part of the merger assessment process as well. (See https://www.progressivepolicy.org/2014/01/can-tech-help-inner-city-poverty/).

Has the FCC Chairman Solved the Net Neutrality Quagmire?

Up until the D.C. Circuit’s recent decision in Verizon v. FCC, extreme voices on the political spectrum dominated the “net neutrality” debate. The far left pressed for extensive government interference in the dealings between broadband providers and websites. And the far right questioned the FCC’s authority and need to regulate Internet services. The D.C. Circuit truncated both tails of the distribution of voices: by rejecting the left’s draconian methods and by affirming the FCC’s authority and basis to regulate Internet services, the Court paved the way for a reasonable compromise. To satisfy the Court, however, the new regulatory regime must leave “substantial room for individualized bargaining and discrimination in terms” of special-delivery arrangements; otherwise it would amount to an outdated mode of regulation called “common carriage.”

The solution, which Bob Hahn, Bob Litan and I have been peddling for a few years, involves the FCC making case-by-case decisions, or what administrative law calls “adjudication.” In a nutshell, the FCC would permit special-delivery arrangements between broadband providers and websites, but the agency would police abuses of that newfound discretion through a complaint process. Adjudication would ensure consumer protections on the Internet, and it would bolster the incentives of both websites and broadband providers to invest at the edges and the core of the network, respectively, generating even more benefits to consumers.

Fortunately, the two Bobs and I are no longer the sole defenders of adjudication. In the two weeks since the decision, adjudication has been endorsed by Professor Kevin Werbach in the Atlantic, Professor John Blevins in the Washington Post, and Professor Stuart Benjamin in his blog post. Most importantly, the concept was floated by FCC Chairman Tom Wheeler in a recent speech in Silicon Valley, and made even more explicit in his speech at the State of the Net conference this week. From an economic perspective, adjudications are the most efficient and most equitable solution available to the Commission. Continue reading “Has the FCC Chairman Solved the Net Neutrality Quagmire?”

Providence Journal: How the U.S. government helped give us the Internet

The Internet has become so integral to our everyday lives that it is easy to forget how young it is. Mosaic, the first graphical web browser, came out in 1993. Since then, the Internet’s phenomenal growth has transformed the way billions of people around the world communicate, learn, work, trade, campaign, mate, protest, plot and form communities.

All this seems inevitable in retrospect, but it wasn’t. In the 1990s, U.S. policymakers faced critical choices about who should build the Internet, how it should be governed, and to what extent it should be regulated and taxed. For the most part, they chose wisely to open a regulated telecommunications market to competition, stimulate private investment in broadband and digital technologies, and democratize access to the Internet.

The story of how scientists, engineers, entrepreneurs and venture capitalists created the technical basis for the digital revolution is familiar. Much less is said about the visionary policymakers who created the legal and regulatory framework that enabled the Internet’s exponential growth.

Controversial at the time, their decisions drew fire from both sides of the aisle. Conservatives complained that Washington’s efforts to expand access to the Internet constituted — you guessed it — a “war on the Web.” Liberals demanded a more aggressive regulatory stance to prevent predatory companies from dominating the digital economy. Also stoking fears of monopolies and demanding regulation were incumbent businesses threatened by new technologies and start-ups.

In steering a pragmatic course between ideological poles, the “digital policy pioneers” showed not only foresight, but a quality even rarer in Washington: humility. Instead of trying to direct the Internet’s evolution, they relied on competition to set prices and they let consumers decide which devices, technologies and services would thrive in the digital marketplace.

Key digital policy milestones include the Clinton administration’s 1993 blueprint for building an “information superhighway,” the landmark 1996 Telecommunications Act, and the 1998 framework for global e-commerce developed by White House adviser Ira Magaziner.

The most important pioneers, however, were President Clinton and Vice President Al Gore. They recognized before most that the new digital technologies were creating something fundamentally different, not just a high-tech version of the old telephone system.

As “Clinton’s idea mill,” PPI played a strong supporting role. During the 1990s, we launched a New Economy Task Force. The Task Force brought together high tech entrepreneurs from Silicon Valley and elsewhere with leading members of Congress to hammer out new “Rules of the Road” for nurturing the nascent digital economy.

The policies that took root in the ’90s were refined and strengthened during the Bush and Obama administrations. Digital policy, in fact, remains a rare, bipartisan exception to the zero-sum logic of polarization that has paralyzed our national government.

Our current leaders must work hard to maintain that consensus. After all, the vibrant innovation ecosystem that has grown up around the Internet is a prime catalyst for jobs and economic growth. According to research by Michael Mandel, PPI’s chief economic strategist, demand for apps has led to 750,000 new jobs since the iPhone was introduced in 2007.

The broadband Internet also is a powerful magnet for private investment. In 2013, telecom and tech companies topped PPI’s ranking of the top 25 companies that invested the most in the U.S. economy. And America is moving at warp speed toward the “Internet of Everything,” which promises to spread the productivity-raising potential of digital technology across the entire economy.

Nor are the Internet’s benefits exclusively economic. Ev Ehrlich, in a report for PPI outlining a “Progressive Broadband Agenda,” stresses ways the Internet can help to reinvent “social” sectors like education and health care, as well as the delivery of government services.

In short, U.S. policymakers need to continue to get digital policy right, because our country’s prosperity and social progress depend upon it. That’s especially true now, as the open and decentralized Internet faces a new array of challenges. These include the backlash to the National Security Agency revelations; Europe’s determination to impose strict “data protection” rules that could deter transatlantic data trade; nationalist demands for “data localization,” which would impede cross-border data flows; the Internet’s use by criminals and terrorists for sinister purposes; and, a growing push by authoritarian countries, especially Russia and China, to subject the Internet to international regulation.

As we look back over the last two decades, it’s clear that the free and open Internet isn’t just a technological marvel. It’s also a major political achievement — and one that looks all the more impressive when juxtaposed to the partisan paralysis that afflicts Washington now. And it’s an achievement that needs defending today.


This piece was originally published by the Providence Journal, you can read it on their website here.

PPI Economic Experts Weigh In On ‘Net Neutrality’ Court Decision

WASHINGTON, D.C. — Progressive Policy Institute senior fellows Hal J. Singer and Ev Ehrlich today released the following statements after a U.S. Court of Appeals struck down rules by the Federal Communications Commission (FCC) prohibiting Internet service providers from restricting user access to legal Web content:

Hal J. Singer is a senior fellow at PPI:

“In its decision to vacate the anti-discrimination and anti-blocking rules of the Open Internet Order, the D.C. Circuit correctly recognized that the FCC used a heavy-handed, ‘common carrier’ approach to regulating Internet access providers in their dealings with websites—despite the Commission’s classification of Internet access providers in a manner that exempts them from treatment as common carriers.

“By effectively proscribing pay-for-priority deals and thereby compelling Internet service providers to provide enhanced services to websites at no cost, the FCC veered backwards into a 20th century, common-carrier approach to regulating a 21st century service.

“The Court appears to have left open alternative regulatory approaches that would permit ‘individualized bargaining’ between Internet access providers and websites while protecting against discrimination in favor of affiliated or preferred websites, including case-by-case adjudication of disputes if and when they arise.

“Hopefully the Commission can now focus its attention on designing such rules in a way that is more consistent with its proper, light-handed approach to all things Internet.”

Ev Ehrlich is a senior fellow at the Progressive Policy Institute and president of ESC Company, a Washington, DC based economics consulting firm:

“Today’s Court decision is not a clear-cut victory for any one side in the Internet policy debate, but it is a victory for that debate.

“On one hand, Verizon, which sued the FCC, challenging its authority to regulate them, got what it wanted. The Court agreed that the 1996 Telecommunications Act protects them from being regulated as a common carrier, meaning that the FCC can’t tell them how to manage their networks. That’s a big win—the decision essentially means that the FCC lacks the authority to mandate specific network practices such as ‘net neutrality.’

“But on the other hand, the Court agreed that the FCC, under its mandate to promote and extend the Internet (which is found in the same 1996 law), can do just about anything that the law doesn’t explicitly prohibit. So while the decision may bar the FCC from taking a few specific actions (like imposing net neutrality), the broad authority the Court found in the FCC’s mandate doesn’t take away its seat at the table.

“What happens next? Presuming this decision stands, one possibility is that the FCC decides to wade into the crux of the matter and classify the broadband Internet as really just another ‘telecommunications service.’ That is, the 1996 law divided communications into a heavily regulated ‘telecommunications’ component based on the legacy phone system, and an essentially unregulated ‘information services’ component, within which the Internet burgeoned. The FCC, urged on by neutrality advocates, could announce that the Internet was really ‘just another phone service’ and impose new regulations on it. But this risks being laughed out of court using the Frank Zappa test, as enunciated in his classic You Are What You Is—a cow don’t make ham.

“There are plenty of real issues surrounding the Internet—such as extending it to the unserved, protecting our privacy, and using it to improve our schools, health care system, and local governments. If the Court gets us past a sterile, theoretical argument over ‘neutrality’ and on to this more pressing agenda, the decision will have turned out to be a very positive one, and a victory for the debate itself.”

– END –

Forbes: Calling America’s New Digital Pioneers

On Forbes today, prominent technology journalist Larry Downes calls for a new generation of “digital policy pioneers” to accelerate the transition from old telephone networks to the all-IP world of voice and data communication. Last month, Downes moderated a unique PPI forum that honored some of the nation’s original “digital policy pioneers” – policymakers whose decisions propelled the Internet’s explosive growth from its infancy in the early Clinton years to a World Wide Web with three billion users worldwide.

Featuring Ira Magaziner, William Kennard, Larry Irving, Michael Powell and Karen Kornbluh, that event highlighted bipartisan efforts to create an enabling legal and regulatory framework for digital innovation. Downes urged the FCC to embrace the same approach to “light touch regulation” as it tackles the unfinished business of retiring legacy telephone infrastructure and moving voice services onto IP networks.  Downes writes:

Today in Washington, there’s a new generation of digital policy pioneers. Let’s hope they can rise to the new challenges of the IP revolution. And heed the wisdom of the first generation who, thankfully, are still providing guidance and leadership.

Find the full article on the Forbes website here.

Senate Commerce Committee Testimony: Crafting a Successful Incentive Auction: Stakeholders’ Perspectives

 “Crafting a Successful Incentive Auction: Stakeholders’ Perspectives”

United States Senate Committee on Commerce, Science, and Transportation

Tuesday, December 10, 2013

Testimony of Hal J. Singer, Ph.D.

Senior Fellow, Progressive Policy Institute

 

The key policy issue facing this Committee is whether to impose asymmetric limits on the amount of spectrum that a bidder may acquire at the auction depending on the location of the bidder’s spectrum holdings—that is, whether to impose an “asymmetric spectrum cap.” In April of this year, the Department of Justice (DOJ) advocated for policies that would support an asymmetric spectrum cap designed to favor bidders that lack low-frequency spectrum. And in his first major policy speech, at Ohio State last week, Federal Communications Commission (FCC) Chairman Tom Wheeler cited the DOJ’s letter in support of such limits. I want to make four simple points about the wisdom of an asymmetric spectrum cap from the perspective of a competition economist concerned with promoting consumer welfare.

First, as a condition of slanting the auction rules in a way to favor certain bidders, one must establish empirically that carriers without access to low-frequency spectrum are impaired in their ability to compete effectively. Although this particular input is not distributed uniformly across carriers, it is hard to detect any impairment in the output market. Despite its lack of low-frequency spectrum, Sprint’s net additions for contract customers were up 18 percent in 2012, and during the third quarter of 2013, Sprint’s postpaid service revenue and ARPU hit record levels. T-Mobile, another carrier that relies largely on high-frequency spectrum, enjoyed its biggest growth spurt in four years in the second quarter of 2013, adding 1.1 million new subscribers. In July, T-Mobile was gaining two subscribers from AT&T for every one it lost to AT&T. This evidence is hard to square with the notion of impairment. Continue reading “Senate Commerce Committee Testimony: Crafting a Successful Incentive Auction: Stakeholders’ Perspectives”

Washington Weighs in On Auction Move

Hal Singer, PPI senior fellow, was quoted by John Eggerton of Broadcasting & Cable on the frequency spectrum auction timetable. FCC Chairman Tom Wheeler made the decision to delay the auction until 2015, which may affect consumers, as Singer explained:

“Wireless carriers are bumping up against spectrum constraints that can only be met with more equipment (which raises incremental costs) or higher prices (to manage the congestion directly),” says Hal Singer, senior fellow at the Progressive Policy Institute. “Both options lead to higher prices, which is bad news for wireless consumers. Ideally, we could free up additional spectrum as quickly as possible.” But, he adds: “If 2015 is the soonest possible to conduct an open, well-run auction, then I understand the delay.”

The article also mentioned Singer’s upcoming testimony before Congress on this issue. You can read the full article here.