
Thursday, January 15, 2015

Obama Proposes Overriding the Tenth Amendment

Obama’s speech at Cedar Falls, Iowa was like most of his speeches: much ado about nothing. He is proposing nothing short of allowing municipal governments to use taxpayer funds to compete against private enterprise, and he is encouraging the FCC to override 20 state laws in contradiction of the Tenth Amendment. I make no bones about being a free-market capitalist, and I am strongly against the government taking over or competing against private enterprise. What Obama is proposing is not only anti-capitalist but also illegal.

I have written here that our current duopolies are not optimal for consumers, but replacing them with a subsidized government bureaucracy is a move in the wrong direction. I support municipal governments determining their own broadband destiny just as much as I support removing the obstacles that municipalities and states have created for new entrants into the market. Twenty states have created laws to protect taxpayers from having to pay for cities’ failed attempts to enter the broadband services market. These states realized that the communications market is competitive and fast moving. They have seen how over 50% of all municipal broadband efforts have failed, leaving taxpayers to pay off creditors and bondholders (link, link). Proponents of government broadband, including the press, are quick to point out the few successes like EPB in Chattanooga and Cedar Falls, but they don’t bring up UTOPIA or Longmont, Colorado, which is on its fourth attempt to provide residential broadband services. There are a variety of reasons that municipal broadband efforts fail, which is why it is better to leave the risk to private enterprise.

Obama cannot simply instruct the FCC to override the 20 state laws enacted to protect taxpayers. The Tenth Amendment reserves to the states the power to make their own laws free of federal interference, except where the Constitution expressly grants power to the federal government. The Supreme Court has already upheld the authority of states to prevent municipalities from providing telecommunications services in Nixon v. Missouri Municipal League, an 8-to-1 decision. He can say what he wants, but case law is already pretty clear on states’ authority under the Telecommunications Act of 1996.

There is no question about the value of broadband services to a community, but broadband should be delivered in a competitive environment to realize all of the value that it brings. Industry partnerships or cities should be allowed to come together to build open-access broadband fiber infrastructure, as is done in many cities and countries outside the United States. Sharing a common infrastructure reduces the barrier to building a profitable business model for a service provider, thereby promoting competition that benefits everyone in the community. This is the direction that Obama should be encouraging states to go.

Saturday, November 22, 2014

There May Be Hope Yet

Over the past couple of weeks, I have been encouraged that more people are discussing open-access infrastructure than before. Maybe it is because they have started to read Title II and realized that it is not the panacea once thought. By definition, more government intervention and regulation means less freedom, and the hundreds of pages of the Communications Act of 1934 are no exception. Title II is jam-packed with regulations designed around telephone service in the 1930s, when we were under the control of the Bell System. Needless to say, it would control every aspect of our Internet services. The Internet was founded as a loose federation of networks that survives if any one link disappears. It was meant so any node could reach and freely exchange information with any other node. Even the technical specifications the Internet was designed around were not called standards or regulations; they are called Requests for Comments, which implies they are fluid and optional. Imposing laws and regulations on the Internet is contrary to its founding principles.

The FCC has indicated that if it imposes Title II regulation, it will choose which parts to enforce. If you expect the FCC to exercise forbearance on any section, you have not been living in America long. Bureaucracies live to grow and expand their power. Eventually, piece by piece, the FCC will implement parts of Title II over the years. Title II also allows paid prioritization, which is a no-no for many net neutrality wonks. People who support this heavy-handed move are either ignorant of what it really means or have ulterior motives, such as lobbying to influence control of the Internet for their business advantage (i.e., crony capitalism).

Thankfully some people are waking up and realizing what regulation will really mean for the Internet, and they realize that competition is the real solution to the yet-to-be-encountered net neutrality issue. Karl Bode wrote an article yesterday on Techdirt concluding that open-access broadband is a superior choice to regulation. Leo Laporte had a well-balanced panel discussion about net neutrality on TWiT that touched on the fact that competition would be superior to regulation. My neighbor to the north, Brett Glass, delivered intelligent arguments against regulation and for open access on TWiT. These intelligent and technical discussions are the ones we should have had two years ago when net neutrality reared its ugly head. Broadband competition can be achieved if we remove the barriers that are preventing it from happening.

Fortunately there is a solution that can eliminate the need for heavy-handed FCC regulation: open-access fiber infrastructure. I am not discounting other means of access such as wireless, but there are impediments to wireless access as well, such as spectrum allocation, which is also under FCC control. I have already extolled the virtues of open access in other posts, so I will refrain from being repetitive. Suffice it to say that there are a few different models for open access. Network unbundling is my least favorite, but in a pinch it will work. Most of the incumbents don’t have fiber deep enough into the network to make unbundling an option. We need to build fiber to every home as well as allocate spectrum for wireless access. The FCC should be working to support those efforts and eliminate barriers for new and existing entrants that want to provide infrastructure. Working on some convoluted semi-regulation scheme is a fool’s errand that will only lead to lawsuits and more complaining. Let’s work on providing a competitive broadband environment instead.

Sunday, April 27, 2014

Why Do We Need the F.C.C. Involved in the Internet?

Apparently my last article hit a nerve with a few people, which is strange because it was only meant to enlighten people about the real issues concerning traffic management and peering to stimulate competition for over-the-top (OTT) providers.  I have to say that at stake here is the exact issue that I have personally encountered.  My lowly blog isn’t backed by any major media outlet or elite-funded NGO, but it reached the world and struck a nerve.  I have over 25 years of experience as an engineer in the telecommunications industry developing and selling network elements.  I am not a lawyer, politician, or lobbyist who wants to call the shots in the industry, but through the power of the Internet my voice has been heard.  The Internet gives those of us with real knowledge a way to be heard (and those without knowledge too).

There are people in our society who don’t want those voices to be heard, or at least want them controlled.  They purport to champion freedom and equality, yet their agenda is just the opposite.  I have no agenda other than supporting the free market and everyone’s ability to be successful on their own terms.  So what does this all have to do with net neutrality?  The ability of all of us with a voice to be heard is under attack by people purporting to support net neutrality.  Let me elaborate.

In my last article, I applauded the common sense rules proposed by the F.C.C. because they allow OTT providers to compete effectively against incumbents.  I do not have anything against the incumbent carriers; I used to be part of a couple of them.  I believe that competition benefits not only consumers but entrepreneurs and incumbents as well.  It is a win-win for everyone except those that end up losing control and power due to free markets.  I believe that the F.C.C. is proposing rules that support the free market and entrepreneurship.  To me this is a technical argument with business implications, not a political discussion.  That is where I got it wrong.

Being an engineer by trade, I always believe that a sound technical solution and logic will prevail.  I also believe that people understand that competition and freedom benefit all.  It appears that even with age, I am still a bit naive.  My article received some negative comments from people and even NGOs that supposedly support an open Internet.  They didn’t like the fact that I was supporting these proposed rules.  There is a huge public misinformation campaign going on across the Internet under the guise that the F.C.C.’s proposed rules will kill the Internet and free speech.  At first I believed that this effort was based on a lack of knowledge of the issues at hand, but now I realize that the people behind this campaign know exactly what is going on.  They are using the general public’s lack of knowledge on the topic to scare them into believing that these rules will benefit the incumbents and kill the Internet, and the tech media is in on it with them.  Their real motivation is to gain greater governmental control of the Internet so they can determine who says and does what.  These groups are disingenuous in their motivations.

This is why the F.C.C., or any governmental organization, does not need to be involved in the Internet.  Government involvement always leads to manipulation by special interests and loss of freedom.  The groups purporting to protect the Internet will actually do the opposite.  There are FCC staff members who are founders of groups campaigning against these rules under the guise of supporting net neutrality.  Unfortunately their disinformation campaign is very effective.  We do not need the F.C.C. to further regulate and interfere in the Internet, although these rules do make sense.  The Internet should remain open for all to speak freely and compete effectively, whether as an incumbent or an OTT service provider.  Please do not be fooled into supporting a cause because it sounds like the right thing to support.  Read and fully understand both sides of the argument, and draw your own conclusion.  Things aren’t always what they seem.

Friday, April 25, 2014

Common Sense at the F.C.C.

Federal Communications Commission Chairman Thomas Wheeler (Brian Fung/The Washington Post)

After a few months of comments by the Chairman of the F.C.C., Thomas Wheeler, that the Commission would consider allowing companies to pay for special arrangements for access to their customers, the Commission will propose a new set of rules at its May meeting that will allow content providers to pay broadband carriers for better access to customers.  In the same statement it included another proposed rule that would prevent any carrier from inhibiting, limiting, or denying access in a way that would limit the openness of the Internet.  The Commission was not specific about how the details of this so-called fast lane could be implemented, but most likely it will involve increased bandwidth at peering points, improved content caching, and traffic prioritization (i.e., Quality of Service).  The F.C.C. was specific in stating that broadband providers “may not act in a commercially unreasonable manner to harm the Internet, including favoring the traffic from an affiliated entity.”

The F.C.C.’s action is based on a January decision by the U.S. Court of Appeals for the District of Columbia Circuit that struck down the FCC’s 2010 net neutrality rules.  That decision still left the Commission with authority under the Telecommunications Act of 1996 to regulate broadband services.  The F.C.C. will still have to look at each agreement on a case-by-case basis, as required by the D.C. court, and it will act on any broadband company that engages in harmful conduct that threatens the openness of the Internet.

The F.C.C. has properly covered its bases here by preserving the value of the Internet that allows any device to freely connect to any other device, but it also wisely recognizes that all bits are NOT created equal.  Since the arrival of Thomas Wheeler at the Commission, the analysis and reports from the staff engineers have triumphed over the politics of the bureaucrats and outsiders.  They realize that best-effort Internet access is not sufficient to promote true content competition.

Since divestiture we have strived for a competitive telecommunications market in this country, but we have been unable to achieve it because the business case is untenable for each service provider to build high-bandwidth fiber networks to homes and small businesses.  It is affordable to run fiber to business customers that spend thousands of dollars a month on services, which is why approximately 40% of all businesses are now served by fiber.  For the residential subscriber that spends less than $50 per month, the cost is prohibitive.  That is why we now have triple-play services offered by a duopoly of incumbent broadband carriers with an ARPU over $100 per month.  A service provider needs at least a 40% market share to earn a reasonable ROI on a network build, which is why you don’t see small start-ups building alternative networks.  Investors are smart enough to realize that it takes too much capital to build these networks and that it is a losing proposition to take on the incumbents.
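To see how sensitive the business case is to market share, consider a toy payback model in Python.  Every number below is an illustrative assumption of mine, not a figure from this post or from any carrier’s actual costs.

# Back-of-the-envelope sketch of why take rate dominates the business case
# for a residential fiber overbuild. All inputs are illustrative assumptions.

COST_PER_HOME_PASSED = 1500.0   # assumed construction cost per home passed ($)
ARPU = 100.0                    # assumed monthly revenue per subscriber ($)
GROSS_MARGIN = 0.40             # assumed share of ARPU left after operating costs

def payback_years(take_rate: float) -> float:
    """Years to recover construction cost at a given share of homes subscribing."""
    monthly_cash_per_home_passed = take_rate * ARPU * GROSS_MARGIN
    return COST_PER_HOME_PASSED / (monthly_cash_per_home_passed * 12)

for take_rate in (0.10, 0.20, 0.40):
    print(f"take rate {take_rate:.0%}: payback {payback_years(take_rate):.1f} years")

Under these assumptions the payback is roughly 8 years at a 40% take rate, but stretches to roughly 16 and 31 years at 20% and 10%, which is exactly the math that keeps investors away from overbuilding the incumbents.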

Fledgling startups like Vonage or Netflix could never afford to build their own networks, but they can leverage the Internet to provide services that compete with the incumbent carriers.  The incumbent carriers control the quality of their own services by utilizing bandwidth outside of their Internet service to deliver voice and video.  Over-the-top (OTT) service providers have no way to control the quality of their services, since the Internet is currently a best-effort network where all bits are created equal.  Before delay- and jitter-critical services like voice and video traversed the Internet, best effort was good enough because no one could really tell if a web page or e-mail arrived a few milliseconds later than it did last time.  Slow performance was usually related to last-mile access bandwidth: throw a few more Mbit/s at a customer and the problem was gone.

Now that video dominates the Internet, the backbone frequently becomes saturated at peering points and in the access network, thereby affecting all traffic.  Customers of OTT companies complain that the quality of the service is poor, and they eventually go back to the incumbent service provider.  The OTT provider loses while the incumbent wins.  The customer loses too, because there is less competition in the market.

The conclusion is that best-effort packet delivery is not good enough for services like voice and video.  Businesses have known that for over a decade, which is why they purchase managed Ethernet services where they can prioritize their voice traffic over video, and both over web surfing and e-mail.  There are two standards by which traffic can be prioritized end-to-end, and several implementation agreements that the industry uses for interoperability.  These same mechanisms can be applied to the Internet to level the playing field for OTT service providers.
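As a concrete illustration of what prioritization looks like at the packet level, the sketch below marks outbound voice traffic with a DiffServ code point.  This is a minimal example assuming a Linux host and Python; the address and port are placeholders, and the marking only matters on networks whose routers are configured to honor it.

import socket

# DiffServ Expedited Forwarding (EF) is the code point conventionally used for
# voice traffic (RFC 3246). The DSCP value occupies the upper six bits of the
# IP TOS byte, hence the shift by two.
DSCP_EF = 46
TOS_VALUE = DSCP_EF << 2

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Datagrams sent on this socket now carry the EF marking; whether it is honored
# end-to-end depends entirely on the policy of each network the packets cross.
sock.sendto(b"voice payload", ("192.0.2.10", 5004))

Managed business services work because every hop in the path is configured to honor markings like this; the public Internet today generally ignores or re-marks them, which is why OTT voice and video are stuck with best effort.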

The F.C.C. is smart to recognize that true service provider competition will take a few more decades to come to the residential market.  Of course, open-access municipal broadband could deliver true competition, as I have written about many times, but allowing content providers to negotiate special peering arrangements and traffic prioritization will offer consumers a real choice in services other than those forced upon them by the duopolies.  OTT service providers like Netflix, Hulu+, Amazon, Google, Vonage, etc. will soon be able to deliver the same quality of service as the incumbent carriers at competitive prices.

Unfortunately this common sense technical solution to enable capitalism has been extremely politicized.  Many of the articles written over the past few months are vehemently against this proposed change in policy, but their fears are fueled by ignorance and vested interests.  Surprisingly, The New York Times and PC World each wrote very good, unbiased articles on today’s announcement that accurately presented the F.C.C.’s proposed new rules.  The usual tripe was spewed by outlets such as The Verge, NPR, The L.A. Times, and CBS’ own CNET, decrying the end of the Internet and quoting any number of Soros-funded front groups.

Their arguments are based on the egalitarian philosophy that all bits should be treated equally.  They trot out anti-capitalist rhetoric and class-warfare arguments.  What they do not realize is that while they think they are sticking up for the little guy (the consumer and OTT providers), they are actually supporting the big guy (the incumbent).  Maybe I am not giving them enough credit and their support is intentional.

A prime example of the misinformation being propagated was on today’s The 404 Show hosted by CNET.  Bridget Carey (@BridgetCarey) incorrectly states that the little guy cannot afford to pay the toll to Comcast that the large companies could easily pay.  Well, Netflix was and still is a little guy compared to companies like Comcast, but they charge an order of magnitude less than the big guy, so adding a couple of bucks a month for superior quality of service has an insignificant impact on their value.  She leaps to the conclusion, with the support of her cohost Jeff Bakalar (@JeffBakalar), that there will be an Internet ghetto for those companies and people that cannot afford to pay.  What they are missing is that only companies with time-sensitive content will want to pay for prioritization; companies just serving up web pages, like Amazon, Facebook, and LinkedIn, can still survive on a best-effort service.  Jeff believes that the Internet should be free and that all service and content providers are inherently evil.  Their arguments were thinly veiled slams at Comcast, which is no surprise since they are paid by CBS.  This is the problem when you have journalism majors applying their political philosophies to the technical domain.  They certainly should not be issued a journalism license.

The problem is that arguments like these will be presented as opposition to the common sense rules proposed by the F.C.C.  They are guided by emotion and fear rather than facts, which seems to dominate today’s political domain.  Thomas Wheeler is the first Chairman in more than a decade who actually understands the industry he is attempting to regulate.  Let’s hope that the rest of the Commission sees reason so we can have a truly competitive content market.

Friday, September 03, 2010

The Wrangling on Net Neutrality Continues This Week

The debate on net neutrality raged on this week, with AT&T weighing in with its position on the topic.  Not only did the company effectively state its case for differentiated services, it also addressed the inaccuracies in the positions of the political opposition groups.  Mr. Hultquist noted in his blog post that the position of net neutrality groups like the Church of Extreme Net Neutrality (CoENN) would make the Internet a “dumb network.”  I applaud AT&T for coming out in support of differentiated services and backhandedly supporting the Verizon/Google principles.  Their article took the direct approach of dispelling the myths of the political opposition groups.

Declan McCullagh of CNET wrote a rather objective piece on AT&T’s announcement covering the basis for AT&T’s position.  In the article he presented opposing views from groups like Free Press.  Mr. McCullagh shows the astute reader that AT&T’s position is based on technical facts, while Free Press’ position is based on opinion with no historical or factual basis.  I hope that CNET and the rest of the press will continue to provide objective reporting on the topic and continue to produce well-researched articles like this one.

The FCC took some action this week, requesting more data from Google and Verizon, as reported by Ars Technica.  Opponents of differentiated services chastise the FCC for dragging its feet, but I think it is giving industry time to align itself and reach an agreement.  I am sure the Commission will not publicly admit to this strategy, but its passive role and quiet support of differentiated services in its Broadband Performance report seem to support my supposition.  The FCC is treading lightly because it knows that net neutrality is a political hot potato, and if it takes no action, the political opposition groups will use the President to put pressure on the FCC.  They realize that their legal authority to implement net neutrality is weak, so they will drive the industry through their inquiry process.  If they push the industry to address wireless networks as well, then they can claim credit for being an active participant.

Companies continue to come forth in support of differentiated services.  Hopefully in the coming weeks more content companies will release statements.  I would like to see a content provider like Vonage, Netflix, or Disney weigh in on the debate.  If these companies realize how differentiated services can allow them to compete and create new content delivery models, then the political opposition groups will not have much ground to stand on.

Saturday, August 21, 2010

The FCC Has Recognized the Need for Differentiated Services

Last week the FCC published its report on U.S. broadband Internet usage entitled Broadband Performance: OBI Technical Report No. 4.  The press chose to report on the sensational claim in the Executive Summary that actual measured bandwidth was half of the advertised bandwidth.  If they had taken the time to read past the Executive Summary, or not simply copied the other articles written about the report, they would have noticed that the report supports Quality of Service (QOS), thereby implicitly endorsing differentiated services.  It even dedicates Appendix 3 to a cursory discussion of QOS.

In Section I, the concept of QOS is first mentioned when profiling the different types of traffic users download.  In the quote below, the FCC states that high-definition video needs both bandwidth and QOS.

At the high end of the range, an application such as enhanced high definition (HD) video teleconferencing could require 5–10 Mbps, or more along with significant quality of service (QOS) performance (see Exhibit 9, where “Symm.”—short for symmetrical—indicates that the download speed is also required for upstream traffic).

In the next paragraph they reveal the other parameters that are required for HD video conferencing.

Download speeds are only one measure of broadband performance.
For example, HD quality videoconferencing requires very fast upload speeds to allow a person to transmit her image and voice while simultaneously receiving the image and voice of another person. In addition to upload and download speeds, measures of QOS such as availability, latency and jitter (variation in latency among different packets) may be important. Some applications, like e-mail or text-based Web surfing, are generally insensitive to these other measures of network performance, but for other applications, such as videoconferencing, these measures may be important (see Exhibit 10).

These statements introduce the reader to the concept that bandwidth alone may not be sufficient for certain types of services.  Later, in Exhibits 9 and 10, services are classified by their need to be experienced immediately along with the need for QOS to determine user experience.  The FCC is unequivocally stating that all bits are not created equal.  It identifies real-time and near-real-time traffic as needing lower packet loss, latency, and jitter than typical web browsing or e-mail reading.  The FCC’s quiet endorsement of differentiated services comes at the beginning of Section III, which states:

The NBP therefore relies on a National Broadband Availability Target defined in terms of quantified download and upload speeds, with qualitative reference to a QOS consistent with the delivery of voice and video applications.

Perhaps the reason the FCC was dragging its feet on net neutrality regulation is that internally it actually supports differentiated services.  The Commission realizes that QOS can improve the overall user experience on the Internet and provide real competition to the incumbents.  By letting Google and Verizon publish their principles of net neutrality, the FCC let those two take the flack for supporting differentiated services instead of staff having to deal with the political fallout.  Whatever the reason, I am glad that the bureaucrats recognize how the application of QOS will benefit the Internet.  Too bad the press missed it.
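For readers who want to make the report’s metrics concrete, here is a small Python sketch of how latency and jitter are computed from per-packet timestamps.  The timestamps are made up for illustration; a real measurement would need synchronized clocks or round-trip probes.

# Hypothetical send and receive timestamps for five packets, in seconds.
send_times = [0.000, 0.020, 0.040, 0.060, 0.080]
recv_times = [0.031, 0.049, 0.075, 0.093, 0.108]

# Latency is simply the per-packet transit time.
latencies = [r - s for s, r in zip(send_times, recv_times)]
mean_latency = sum(latencies) / len(latencies)

# Jitter is the variation in latency among packets; here it is estimated as the
# average absolute change between consecutive packets (RFC 3550 applies a
# smoothing filter to the same idea).
jitter = sum(abs(b - a) for a, b in zip(latencies, latencies[1:])) / (len(latencies) - 1)

print(f"mean latency: {mean_latency * 1000:.1f} ms, jitter: {jitter * 1000:.1f} ms")

Voice and video degrade noticeably as these numbers climb, which is exactly why the report treats them as first-class QOS measures alongside raw bandwidth.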

Saturday, April 10, 2010

Is The Court of Appeals Decision in Comcast v. FCC Good for Net Neutrality?

Much was written this week about the U.S. Court of Appeals for the District of Columbia Circuit’s decision against the FCC’s order sanctioning Comcast for blocking BitTorrent traffic in 2008.  Most of those articles missed the point of the decision and declared that the FCC cannot regulate the Internet.  This decision said one thing, and one thing only: the FCC overstepped its enforcement authority in telling Comcast how it can manage its network.  It did not vindicate Comcast for blocking BitTorrent traffic, nor did it say that the FCC cannot create regulations and enforce them on Internet services.  It simply set a limit on where the FCC’s enforcement ends based on its past actions.  Specifically, the court stated that the FCC did not have ancillary authority to regulate Comcast's network management practices.1  It is expected that the FCC will appeal the case to the Supreme Court.2

On the surface it may appear that Comcast and other Internet Service Providers (ISPs) are the winners and the public is the loser.  That interpretation is not entirely accurate when you take a longer-term perspective.  The backlash from the decision may be worse than the decision itself.  The court itself made a point of supporting the necessity of a free and open Internet, as noted in this statement from the FCC:

"The court in no way disagreed with the importance of preserving a free and open Internet, nor did it close the door to other methods for achieving this important end," said FCC spokeswoman Jen Howard.3

The court’s decision prompted an immediate backlash from the press, consumer groups, and lawmakers calling for Congress to take action to remedy the situation.  That remedy could range from reclassifying Internet service as a telecommunications service, which would give the FCC the necessary authority, to a law defining “net neutrality” and other aspects of regulating the Internet.  All of these come with consequences that could restrict innovation and unfettered use of the Internet.

The FCC itself thwarted its own ability to regulate Internet services when it classified them as less-regulated Title I services.  I believe that this was the most appropriate action for it to take because it limited the Commission’s authority to regulate.  If it had kept them as a Title II service, then it would have been within its jurisdiction to regulate Comcast’s and other ISPs’ traffic management techniques.  That would have stifled innovation and the delivery of new services, because service providers would have opted for more restrictive services and information providers like Google would have had to fight it out at the FCC and in the courts.  If the FCC attempts to reclassify Internet access as a Title II service, expect to see this type of behavior.

The alternative is to get Congress involved and have it legislate the definition of net neutrality and expand the FCC’s powers even more.  Although this may be what the EFF and other consumer advocates want, the most likely scenario is that the resulting legislation is something nobody wants, and it could even contradict the principles of net neutrality.  Almost no Congressman understands the nuances that distinguish an application, site, or service from data transmission.  I have written at length on my view of net neutrality, and the FCC has come out with a higher-level statement that does not contradict my principles.

I clearly believe that this issue should stay under the jurisdiction of the FCC and that the FCC needs to clearly define the rules of net neutrality with the hands-off approach that made the Internet what it is today.  Congress does not have the expertise, nor is it the proper forum for industry, regulators, and consumers to come together to define how to keep innovation and commerce flowing on the Internet.  The FCC needs to go through the proper rulemaking procedure so it can enforce these principles.  Service providers need the ability to manage traffic on their networks to ensure a quality experience for all customers, and consumers need the ability to access any lawful service over these networks equally, whether it is provided by the network provider or a third party.  The best way to achieve this balance is to have true competition in the access network.  Regulation is a last resort when there is no competition, and apparently I am not alone in my opinion.

My next article will discuss how Google is doing more to stimulate competition than the National Broadband Plan.

Friday, December 11, 2009

The FCC Still Attempts Interoperability Standards for Public Safety

On Wednesday, Silicon Flatirons sponsored the latest presentation in the Center’s Policymaker Series.  Retired Rear Admiral James Arden Barnett, Chief of the FCC’s Public Safety and Homeland Security Bureau, outlined his bureau’s role in specifying public safety interoperability requirements in the National Broadband Plan that we anxiously await next February.  The Chief shared with the audience that the FCC will be drafting interoperability requirements for public safety broadband networks, and that it is considering several models for building and funding these networks.  Adm. Barnett expressed his desire for openness, but he did not stay long for questions or provide the audience with any contact information for his team.  My impression is that this is the same openness we are seeing from other parts of this administration.

Politics aside, two themes were prevalent during the reception after the talk: the lack of local public safety and industry input, and the belief that the best option for creating this network is Federal government funding.  In the Chief’s defense, he was going to visit Intrado in the afternoon.  During his talk, he rattled off a list of government agencies that he planned to consult in drafting the interoperability requirements, but not once did he mention the TIA, IEEE, IETF, or other industry standards bodies.  Our industry has a long, successful history of creating interoperability standards, from the SONET Interoperability Forum to the Wi-Fi Alliance, the Metro Ethernet Forum, and others.  These organizations are comprised of all stakeholders in the process, especially the ones developing the technology.  Also noticeably absent from the process were the first responders.  They are the eventual customers of this process and need to state their needs.  Each organization and locality has different needs, and the standards need to remain flexible enough to account for them, which leads me to my next point.

Public safety networks are typically funded and built locally and regionally.  They are not something built from Washington.  I applaud the FCC for acting as a catalyst for creating interoperability requirements, but it cannot dictate technology and products.  A one-size-fits-all approach will not work in a country as diverse as ours.  Adm. Barnett hinted at specifying LTE as a technology for building broadband public safety networks.  The FCC should focus on application-layer interoperability issues, not specific transport-layer technologies.  The resiliency of the network will come from the diversity of transport technologies utilized.  He also expressed the belief that commercial networks may not be as reliable as dedicated government-run networks.  May I remind the Chief that it was the Nextel iDEN network that held up best during the 9/11 attacks, and that our national defense plan relies on commercial networks during times of emergency.

A great example of an interoperable broadband public safety network is in NYC.  DOITT has done an excellent job of utilizing private and commercial facilities to build an IT infrastructure for the city.  I recommend that the bureau spend more time with this organization to learn how it built its network, and use it as a model that other cities may follow.

Without industry input, I am afraid that this 10-20 page addition to the National Broadband Plan will be another vague government edict that will not get us any further than when this idea originated 8 years ago.  If this is an area of interest for you or your company, I suggest contacting the bureau directly to provide your comments.

Tuesday, November 24, 2009

Taking Ownership of the Rural Network

I read two stories today that affirm my conviction that local governments should build their own network infrastructure and sell access to private service providers.  The first article, from TelecomTV, discusses how the two largest LECs in the U.S., AT&T (T) and Verizon (VZ), are reducing investment in and even neglecting their rural networks.  Verizon’s sales of its landline assets in rural areas to FairPoint and Frontier are witness to this strategy.

So what are we to do?  Should the FCC expand Universal Service and force the LECs to serve these smaller, less profitable markets?  Do we impose unbundling on all LECs, as the second article suggests?  These heavy-handed government solutions are just taxes that produce minimal results in a marketplace that is trying to be competitive.  The best solution is to let municipalities or other local government entities build and take ownership of their infrastructure.  It is commonly done with roads, sewer, water, and electric utilities.  Why can’t we do it with telecommunications?  It is the next infrastructure.

Allowing local governments to build their own infrastructure means they can take advantage of their long-term financing capabilities, whereas a private corporation’s shareholders would expect a quicker payback.  The municipality in turn sells access to this network on a non-discriminatory basis to any service provider that would like to sell services.  Such an open-access network can enjoy greater than 65% utilization, bringing the time to a positive ROI down to 5-7 years, at which point the governmental entity starts making money.  Cash flow becomes positive in just a couple of years, enabling the city to build out the entire area over time.  This objective can be accomplished while remaining taxpayer neutral.
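Here is a rough sketch of that math in Python.  Every input is my own illustrative assumption, not a figure from this post or from any actual municipal project.

# Toy model of an open-access network funded with long-term municipal debt.
# Utilization counts every provider's subscribers riding the shared fiber.
COST_PER_HOME_PASSED = 1500.0   # assumed construction cost per home ($)
WHOLESALE_FEE = 30.0            # assumed monthly wholesale fee per connected home ($)
UTILIZATION = 0.65              # share of homes passed that some provider serves
BOND_RATE = 0.04                # assumed municipal bond interest rate
BOND_TERM_YEARS = 20            # long-term financing available to a city

# Level annual debt service per home passed (standard annuity formula).
r, n = BOND_RATE, BOND_TERM_YEARS
annual_debt_service = COST_PER_HOME_PASSED * r / (1 - (1 + r) ** -n)

annual_wholesale_revenue = UTILIZATION * WHOLESALE_FEE * 12
annual_net_cash = annual_wholesale_revenue - annual_debt_service

print(f"debt service per home passed:  ${annual_debt_service:.0f}/year")
print(f"wholesale revenue per home:    ${annual_wholesale_revenue:.0f}/year")
print(f"net cash per home passed:      ${annual_net_cash:.0f}/year")

Under these assumptions the shared network throws off positive cash per home almost immediately, because the debt is spread over twenty years and every provider’s customers contribute to the same pot.  That is the crux of the open-access argument.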

The city enjoys the increased economic benefits provided by the network and true competition among service providers.  Traditional service providers such as Verizon, AT&T, Comcast, and Time Warner Cable can still provide services to the community just as they would over their own single-purpose networks.  Additionally, new service providers can now offer services that compete with the traditional service providers.  Imagine Verizon offering FiOS in Boulder.

This approach reverses the trend of looking at the solution from the national level rather than the local level.  Localities are free to choose an architecture, technology, and implementation plan that suits their needs, not have one dictated to them by the FCC or some other federal agency.  Yes, there are challenges in funding, planning, constructing, and operating these networks.  Subsequent articles will address those issues.

It is time for the U.S. to take a new approach to its broadband strategy and enable communities to take their destiny into their own hands.  There are already several instances where such a business model is working in the U.S.  The problem is that many cities face legal challenges from incumbent providers trying to protect their outdated business models.  It is time for them to realize that the service is not the network.  Take a lesson from Google, whose services are delivered exclusively over third-party networks.

If the FCC has to exert a heavy hand, let it be to abolish any restrictions preventing localities from building and operating open-access broadband infrastructure.  Private enterprise will step in and assist these communities in building these networks, just as it did in the early 20th century when America was electrified.