
Yahoo Data Center plans growth in Grant County

Kelly Schott, a site operations technician, works on one of the thousands of servers inside the Yahoo Data Center in Quincy, Wash., on May 7. About 40 people work at the site, one of three major data centers owned by Yahoo. (Jesse Tinsley)

When the World Cup started last month, the little blue lights on thousands of racks inside Yahoo’s Quincy data center began flickering faster than usual. 

Millions of people on phones, tablets and computers were tracking teams and watching video from the event, a once-every-four-years sports spectacle that garners intense online interest.

Much of that Web traffic pulsed in and out of Yahoo’s 400,000-square-foot data center, which sits near farm fields in the heart of Grant County.

Built in 2007, Yahoo’s building is filled with long rows of servers whose tiny LED lights flicker as data flows in and out of the center.

Such buildings are the physical manifestation of the digital world of zeroes and ones, the basic molecules of online activity.

The activity inside allows people to check Yahoo mail, do Web searches, share links, view videos, post photos to a Tumblr page or upload pictures to Flickr – two sharing sites Yahoo owns.

Quincy was the first of Yahoo’s owned and operated data centers. Before that, the Sunnyvale, California, company leased server space inside other firms’ data centers.

Today, Yahoo has more than a dozen data centers worldwide, said Suzanne Philion, a senior manager for Yahoo public relations.

At the same time Yahoo was building its center, Redmond-based Microsoft also started its own data center in Quincy.

Since then, five other companies, including Dell and Intuit, have joined the Grant County club. All are there, their officials say, because of low hydropower rates, ample fiber connections and inexpensive land. The industrial power they use, generated by two dams operated by the Grant County Public Utility District, comes at rates among the lowest in the country.

Most data center owners don’t allow visitors inside their buildings, citing security and customer-protection concerns.

But Yahoo’s data center team recently agreed to permit an extended media visit to the Quincy site.

The tour, which occurred in May, covered the entire building during normal daytime working hours. Some questions were not answered, such as the number of servers and total number of CPUs inside the Quincy center.

Yahoo officials say the Quincy center is designed to continue to grow, with plenty of room for expansion.

While protective of what happens inside the data center, the global tech company wanted to open the doors to showcase innovations and its skilled workforce, Philion said.

“We’re proud of our facility here. And importantly, we’re growing our team,” she said. “We want to get the word out what we do, what it looks like and also spread the word that we’re hiring.”

The Yahoo center is made up of two adjoining buildings. The second building was added in 2011.

There are practically no windows, except for a panel of glass along the front wall near the main entry.

The company employs just more than 40 workers. Most of them gather during lunch hours at a common dining area near the front of the building. Yahoo provides free lunches from Quincy-area restaurants or caterers.

It operates with a handful of managers; most of the staff is made up of technicians, computer operators and facility support staff.

The cooling coop

Data farms, or data centers, consume immense amounts of energy. Most of that power runs the computers and network devices inside. But companies like Yahoo and Microsoft also use power to pull hot air out of server rooms to keep the devices functioning properly.

In a console room at one end of the large main building, network managers watch displays that keep them aware of how data is transferred in and out of the center.

In the building’s Server Room One, a technician holds a laptop computer and examines information on the screen. The laptop is plugged via cable into one of the servers sitting on a rack. He’s diagnosing a problem that could be either a server starting to fail or a networking bottleneck keeping it from operating.

Every server on the rack is labeled with a numeric code to help locate it among the vast number of devices spread across the two buildings.

Josh Schuyleman, Yahoo’s manager of data center operations, said most problems are hardware failures or network disruptions.

Because Web traffic can be rerouted easily when hardware or networking problems pop up, most Yahoo users won’t notice when a data center problem occurs, he said.

“You might only notice, if you’re trying to load a page, a very, very short momentary interruption or delay,” he said.

Companies that do most of their business online find themselves always trying to manage the increasing glut of customer data that comes with personal information, videos, photos, spreadsheets, blogs and business backups.

The online customer doesn’t want any delay in finding or using that data, Schuyleman added.

“It’s no longer OK for a page to load slowly. People used to be patient while it took some time for the old AOL start pages to load. Not anymore,” Schuyleman said.

Brian Huck, Yahoo’s Quincy data center manager, said the company replaces its servers every three to four years. While expensive, that plan ensures the servers operate more reliably, he said.

“The new servers use less power, they have bigger drives and you’re maximizing your energy,” Huck said.

One of the tour leaders was Chris Page, Yahoo’s global director for energy and sustainability strategy. Page explained how the Quincy center, over the past seven years, has gone through two major energy-efficiency upgrades.

In the building’s Server Room One, the company uses the standard data center cooling method of chilling the room. To keep the servers running properly, Yahoo pumps in cool air and tries to maintain a steady room temperature.

Page said Yahoo installed 1,200 temperature sensors around that room to identify the hot and cold areas and redistribute air for efficient cooling.
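To make the idea concrete, here is a minimal sketch of how readings from such a sensor grid might be summarized to flag hot and cold areas; the data format, temperatures and threshold below are hypothetical illustrations, not details of Yahoo’s actual system.

    # Hypothetical illustration: summarizing temperature-sensor readings
    # to flag hot and cold areas of a server room. The positions,
    # temperatures and threshold are invented for this sketch.
    from statistics import mean

    # (aisle, position, temperature in degrees Fahrenheit)
    readings = [
        (1, 1, 68.5), (1, 2, 71.2), (1, 3, 84.9),
        (2, 1, 66.1), (2, 2, 79.8), (2, 3, 90.3),
    ]

    average = mean(temp for _, _, temp in readings)
    margin = 10.0  # degrees from the average that counts as an outlier

    hot = [(a, p) for a, p, t in readings if t > average + margin]
    cold = [(a, p) for a, p, t in readings if t < average - margin]

    print(f"average {average:.1f}F, hot spots {hot}, cold spots {cold}")

With the hot and cold areas identified, operators can redirect airflow to the spots that actually need it instead of overcooling the whole room.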

In Server Room Two, Page said Yahoo’s next cooling system involved reconfiguring the entire room, dividing it into hot aisles and general work areas.

Instead of cooling the entire room, that plan creates enclosed areas that wall off the heat-producing servers from everything else.

Each enclosed aisle is roughly 15 feet long, 9 feet tall and 4 feet wide. The racks of servers pump heat into the enclosed aisle, where fans pull that heat out and circulate it through the rest of the uncooled main areas of the room.

Page said the first cooling system is like cooling a glass of milk by cooling the whole room. “Think of this (second) way as placing a glass of milk inside a refrigerator instead of cooling the whole room,” Page said.

The cooling upgrade was incorporated into the design of the building added in 2011.

That building has the name Yahoo Computing Coop. The engineer who helped design the system said he drew inspiration from studying a backyard chicken coop’s upward-sloping roof.

The Coop building has a series of wall louvers that allow fresh air to enter the data center from the outside, taking advantage of the cool mornings and evenings in Quincy.

That air then flows through two rows of cabinets on the main floor, and as the air warms, it rises upward toward the slanted ceilings of the building.

The heat is funneled out of a long narrow chicken-coop-inspired cupola on the roof.

On most days, the Coop system is so efficient that practically no additional energy is needed to reduce heat in the building, Page said.

Philion said the Quincy center operates with a power usage effectiveness, or PUE, rating of 1.2 to 1.3.

All companies operating data centers try to push that PUE score as close as possible to 1.0, which would mean all the power a center draws is put to use by the server equipment, with none consumed running cooling systems.

In the first generation of data centers, a 2.0 PUE was considered normal.
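To illustrate the arithmetic behind those scores, here is a brief sketch of the PUE ratio; the kilowatt figures are hypothetical, chosen only to match the ratios cited above.

    # Hypothetical illustration of the PUE calculation described above.
    # PUE = total power drawn by the facility / power used by IT equipment.
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        return total_facility_kw / it_equipment_kw

    # A center drawing 1,200 kW overall to run 1,000 kW of servers:
    print(pue(1200.0, 1000.0))  # 1.2, the low end of the range cited for Quincy

    # A first-generation center spending as much again on cooling and overhead:
    print(pue(2000.0, 1000.0))  # 2.0, the old norm

At a PUE of 1.2, roughly one watt in six goes to cooling and other overhead rather than to the servers themselves.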

Not anymore, Philion said.

“Our team is continually working to ensure that our PUE is as low as possible.”