EvoSwitch Plenary Meeting

18th November

We will be hosted at EvoSwitch by Eric Lisica, Vice President of Operations.

It is very likely that they will give us a corporate spiel, but because this is a unique opportunity to get some answers from an insider, it would be good to keep certain things in mind.

Cloud computing — the creation of large data centers that can be dynamically provisioned, configured, and reconfigured to deliver services in a scalable manner — places enormous capacity and power in the hands of users. As an emerging technology, however, cloud computing also raises significant questions about resources, economics, the environment, and the law. Many of these questions concern geographical considerations related to the data centers that underlie the clouds: physical location, available resources, and jurisdiction. While the metaphor of the cloud evokes images of dispersion, cloud computing actually represents a centralization of information and computing resources in data centers, raising the specter of corporate or government control over information if these geographical issues, especially jurisdiction, receive insufficient consideration.

“A reasonable question underlying all of these issues is: Where is “the cloud” located? The cloud service as an application can be anywhere and everywhere one has access to a computer. However, the cloud is actually comprised of the networked computers and servers and related infrastructure, so they physically have to be somewhere. One of the assumptions of cloud computing is that location does not matter. That is both correct and incorrect. From a cloud user’s point of view, location is irrelevant in many circumstances; from a cloud provider’s point of view, it can be of critical importance in terms of issues of geography, economics, and jurisdiction. As such, data centers must be somewhere and they can be anywhere so long as physical geography is right, but national geography is not the first consideration in many cases.”

Where is the cloud?

When asked this question, a technologist will surely chuckle and reply something akin to, “The location of the cloud is irrelevant. Anyone will be able to tap into the power of the cloud from anywhere.” This answer, while technically accurate, misses an important set of issues. The main thesis of this article is that cloud computing represents centralization of information and computing resources, which can be easily controlled by corporations and governments. We address the issue of control in the next section, and focus here on the literal answer to “where is the cloud?”

Cloud computing is, of course, a metaphor whose origins lie in computer network diagrams. The cloud itself is an abstraction used to represent the Internet and all its complexity. When network administrators construct diagrams of computer networks, the image of a cloud references the Internet as a resource without needlessly illustrating its complexity. The “cloud” in cloud computing therefore represents a complex and powerful resource whose workings are hidden from its users. As the term came from mapping and diagramming and was used as an abstraction, it is highly ironic that the implementation of cloud computing leads us back to talking about locations.

The power of the cloud derives from the countless computer servers that comprise it. Google, for example, is reported to own over one million servers spread across the globe to power everything from search to Web–based applications (Baker, 2007). Due to the benefits gained from economies of scale, the majority of these servers are concentrated in a handful of large data centers, each hosting tens if not hundreds of thousands of machines. And by no means is Google alone in maintaining these vast “server farms”: Yahoo, Microsoft, IBM, Amazon — essentially any company that has a significant presence on the Internet — all own and operate large data centers. It is no exaggeration to claim that these data centers represent the largest concentration of information and computing resources the world has ever seen. In the United States alone, there are an estimated 7,000 data centers (Economist, 2008a).

The placement of data centers, each of which represents a significant investment, is a major issue for Internet companies as well as for the organizations and individuals that rely on the services run through those centers. “Data centers are essential to nearly every industry and have become as vital to the functions of society as power stations are.” [3] There are four primary considerations in deciding where a data center is constructed:

  • Suitable physical space in which to construct the warehouse–sized buildings
  • Proximity to high–capacity Internet connections
  • The abundance of affordable electricity and other energy resources
  • The laws, policies, and regulations of the jurisdiction
The first three are straightforward to understand and are to a large extent governed by physical constraints based on environmental variables. Considerations of physical space can include: basic physical geography (finding a suitably flat surface space or, in some cases, appropriate underground locations); climate and weather (limiting the risks of natural disasters); energy–saving natural features (leveraging perhaps water, geothermal, or wind to provide cooling and power); and safety (locating an area that is low in crime, far from likely terrorist targets, and easy to guard against corporate espionage).
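
Purely as an illustration of how a provider might weigh these four criteria against one another (the candidate sites, scores, and weights below are all invented, not drawn from the article), the comparison can be sketched in a few lines of Python:

    # Toy weighing of the four site-selection criteria listed above.
    # All candidate sites, scores, and weights are invented for illustration.
    WEIGHTS = {"space": 0.2, "connectivity": 0.3, "energy": 0.3, "jurisdiction": 0.2}

    CANDIDATES = {
        "rural-iowa": {"space": 0.9, "connectivity": 0.5, "energy": 0.9, "jurisdiction": 0.8},
        "amsterdam":  {"space": 0.4, "connectivity": 0.9, "energy": 0.6, "jurisdiction": 0.7},
    }

    def score(site):
        """Weighted sum of the four criteria; higher is better."""
        return sum(weight * site[criterion] for criterion, weight in WEIGHTS.items())

    best = max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))
    print(best, round(score(CANDIDATES[best]), 2))  # rural-iowa 0.76

In practice the weights themselves are a business judgment; as the excerpt above notes, national geography is often not the first consideration.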

Beyond physical considerations, proximity to high–capacity Internet connections is also important, since a data center’s value is measured by the number of users that can rapidly tap into its power. Thus, it is desirable to place data centers close to the “Internet backbone,” the main “trunks” of the network that carry most of its traffic, although some companies may elect to build their own network links. Finally, since data centers consume vast amounts of energy (for powering and cooling the servers), locations with cheap energy are highly attractive. The final consideration, that of the laws, policies, and regulations of the jurisdiction, will be discussed in detail below.

As a result, many data centers are being built in locations with plentiful land, favorable corporate tax rates, and affordable electricity, often from natural resources (Foley, 2008; Gilder, 2007). Rural Iowa, with its widespread wind power, and rural Oregon and Washington, with their ample hydroelectric power, exemplify these types of locations. Internationally, data centers can be found in abandoned mines, old missile bunkers, empty shopping malls, underground facilities, and in places as far flung as Iceland and Siberia, all to save on energy costs (Economist, 2008b). The key consideration for a data center location is often energy consumption.

During the dot-com boom of the 1990s, a data center consumed on average one to two megawatts (Katz, 2009). Now, a large data center individually consumes as much power as an aluminum smelter, with one Microsoft facility in Chicago needing three electrical substations to feed its constant demand for 200 megawatts of power (Economist, 2008b). The aluminum smelter comparison is particularly significant: aluminum can typically only be smelted with electricity, so the amount of electricity a smelter uses is massive in comparison to other similar types of factories.

Collectively, the data centers in the U.S. consume electricity on a level with a sizeable city — equal to the energy consumption of Las Vegas. Data centers consumed one percent of the world’s electricity in 2005, and the carbon footprint of data centers will surpass that of air travel and many other traditional industries before 2020 (Ahmed, 2008). By that time, networked computing may consume half of the world’s electricity (Gilder, 2007). Therefore, access to and the cost of electricity is an important factor in data center locations, as the price per kilowatt–hour can differ widely from region to region (Armbrust, et al., 2009).
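
To make that regional price spread concrete, here is a small, purely illustrative calculation; the facility size and the two per-kilowatt-hour prices are assumed figures, not taken from the article or its sources:

    # Illustrative arithmetic only: facility size and prices are assumptions.
    facility_mw = 10                    # assumed continuous draw of one facility
    hours_per_year = 24 * 365

    for price_per_kwh in (0.05, 0.12):  # e.g., cheap hydro region vs. urban grid
        annual_cost = facility_mw * 1000 * hours_per_year * price_per_kwh
        print(f"${price_per_kwh:.2f}/kWh -> ${annual_cost:,.0f} per year")

Even at this modest assumed scale, the cheaper region saves millions of dollars per year, which is why providers chase inexpensive hydroelectric and wind power.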

As much as eight to nine percent of energy is lost just in transferring it to the servers themselves (Katz, 2009), which means energy cost and energy efficiency are important aspects of data center management and, ultimately, of a cloud provider’s bottom line. Many cloud computing providers are focusing on harnessing technology to reduce their data centers’ power usage effectiveness (PUE), an energy efficiency metric defined as the ratio of a facility’s total energy consumption to the energy delivered to its computing equipment. A PUE value of 1.0 means that the data center is completely optimal, losing almost no energy to cooling systems or to the distribution of electricity. Currently, most data centers average a PUE value of 2.0 or more (Katz, 2009). To reduce their PUE, data centers often look to green technology, or even to their surrounding location, harnessing the local environment for improved distribution or cooling (Katz, 2009).
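
Since PUE is just a ratio, the arithmetic is easy to check; the kilowatt-hour figures in this minimal sketch are invented for illustration:

    def pue(total_facility_kwh, it_equipment_kwh):
        """Power Usage Effectiveness: total facility energy over IT energy."""
        return total_facility_kwh / it_equipment_kwh

    def overhead_fraction(pue_value):
        """Share of the facility's energy lost to cooling and distribution."""
        return 1 - 1 / pue_value

    # Invented figures: a facility drawing 2 GWh to deliver 1 GWh to servers.
    print(pue(2_000_000, 1_000_000))  # 2.0, the typical value cited above
    print(overhead_fraction(2.0))     # 0.5: half the energy never reaches servers
    print(overhead_fraction(1.09))    # ~0.08, comparable to the 8-9% loss cited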

In reaction to the energy costs and environmental impacts of data centers, cloud providers are exploring a number of alternatives to the traditional geography of a data center. The emphasis on a more environmentally conscious provision of data through cloud computing is sometimes called “following the moon”: becoming more in tune with nature can be more efficient and better for society (Perry, 2008). For example, energy costs can be lower at night and cooling equipment may use less energy, so routing traffic to data centers where it is currently night can save energy, reducing costs and environmental impacts.
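
A toy sketch of what “following the moon” routing might look like in code; the site names are hypothetical, and the assumption that the cheap-energy window runs from 22:00 to 06:00 local time is invented:

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    # Hypothetical sites: names and zones are illustrative, not real facilities.
    DATA_CENTERS = {
        "us-oregon": "America/Los_Angeles",
        "eu-amsterdam": "Europe/Amsterdam",
        "asia-singapore": "Asia/Singapore",
    }

    def is_night(tz_name, night_start=22, night_end=6):
        """True if local time falls in the assumed cheap-energy night window."""
        hour = datetime.now(ZoneInfo(tz_name)).hour
        return hour >= night_start or hour < night_end

    def pick_site(centers):
        """Prefer a site where it is currently night; fall back to any site."""
        for name, tz_name in centers.items():
            if is_night(tz_name):
                return name
        return next(iter(centers))

    print(pick_site(DATA_CENTERS))

A real scheduler would also weigh latency to users, energy prices, and data-transfer costs, not just the local hour.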

Another alternative is the portable or easily assembled data center. Sun has developed a modular data center, formerly code-named “Project Blackbox,” which is a data center in a large shipping container (Sun Microsystems, 2007; Reimer, 2007). Google and IBM are also currently exploring this concept (Jones, 2007). The key to this kind of data center is the use of specially designed, highly powerful servers that can dramatically reduce the number of machines and the power capacity needed. Such a data center can be flown to where it is needed, installed, and brought online in potentially less than a day.

The most distinctive alternative approach to data centers thus far, however, has recently been proposed. Google has filed a patent application for water–based data centers that would use energy generated by the ocean to power and cool the servers, with the ships housing these data centers positioned in international waters (Ahmed, 2008; Vance, 2008). This innovative approach has already been nicknamed “the Google Navy.” Beyond the energy cost savings and reduced environmental impacts, Google believes that one of the big advantages of such an approach is that the ships could be repositioned in response to a natural disaster or other catastrophe (Vance, 2008). Though the idea is certainly intriguing, it seems to overlook some practical issues, such as how to defend the floating data centers from pirates drawn to pilfer the vast amounts of technology on board. Ironically, the idea of a Google Navy was first predicted by the satirical newspaper The Onion (see http://www.theonion.com/content/radio_news/google_steps_in_to_help_u_s).

Open Questions

  • The internet is perceived as a public space where everyone can express themselves. Are data centers part of this?
  • What is the role of governments in the building of these data centers? Is there any advantage in building a data center in one country rather than another?
  • Can the internet run without these data centers?
  • When did these data centers start appearing?
  • To what jurisdictions are they subject?
  • Are they subject to the same data retention laws as other internet service providers? Who is responsible for implementing those laws?
  • How many data centers are there in Holland? And in the world?
  • How much energy does the average data center consume?
  • How many of these are powered by coal or other fossil fuels?
  • Do you have backups of everything?
  • If an ISP is hosting in your data center and it turns out that an individual customer has illegal content, to what extent is the data center responsible for acting on a take-down notice?
