
Data Center With a Brain: Google Using Machine Learning In Server Farms

Unknown Lamer posted about 8 months ago | from the skynet-online dept.


1sockchuck (826398) writes "Google has begun using machine learning and artificial intelligence to analyze the oceans of data it collects about its server farms and recommend ways to improve them. Google data center executive Joe Kava said the use of neural networks will allow Google to reach new frontiers in efficiency in its server farms, moving beyond what its engineers can see and analyze. Google's data centers aren't yet ready to drive themselves. But the new tools have been able to predict Google's data center performance with 99.96 percent accuracy."
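For context, PUE (power usage effectiveness) is the efficiency metric at issue in the comments below: total facility power divided by IT equipment power, so an ideal facility scores 1.0. A minimal sketch of the arithmetic (the example numbers are made up):

```python
# PUE (power usage effectiveness): total facility power / IT equipment power.
# 1.0 is the theoretical ideal; the closer to 1.0, the less overhead
# (cooling, power distribution, lighting) per watt of useful compute.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1100 kW in total to run 1000 kW of servers:
print(pue(1100.0, 1000.0))  # 1.1
```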


No literature review? (2)

tabate (3593817) | about 8 months ago | (#47110163)

ML has already been proposed to improve the performance and resource efficiency of large-scale datacenters. Detailed information on two of the most well-known examples from Stanford and Berkeley can be found below: http://engineering.stanford.ed... [stanford.edu] http://www.eecs.berkeley.edu/P... [berkeley.edu]

Re:No literature review? (1)

OOSCARR (826638) | about 8 months ago | (#47110481)

Jim's approach is for hardware monitoring.

New tool? (0)

Anonymous Coward | about 8 months ago | (#47110171)

"we've hit upon a new tool: machine learning"

Mmh... machine learning isn't exactly a new tool...

Re:New tool? (1)

jythie (914043) | about 8 months ago | (#47111061)

Yeah, but when you work in an industry that fetishizes 'self learning' and often has a dim view of academic things like theory or history, things often seem like new ideas, since people don't actually know much about what has already been done.

So... (0)

Anonymous Coward | about 8 months ago | (#47110185)

Level of Skynet alert is...???

Re:So... (1)

Anonymous Coward | about 8 months ago | (#47110197)


Re:So... (1)

Samuele Kaplun (2908495) | about 8 months ago | (#47110213)

Level of Skynet alert is...???

That depends... can they already wire in new hardware without any human intervention?

Re:So... (1)

geekoid (135745) | about 8 months ago | (#47110791)

You design chips for a living. A next-generation design appears on your desktop. You can delete it, or look like a superhero genius at work and submit it.

You're a CEO, and in front of you sits the design for the next chip, created by one of your people with very little manpower. Do you trash it or make billions?

You're a tech. You get a work order to install something. The box arrives. Do you install it, or risk getting fired?

So, in a way, yes.

A little top heavy! (3, Insightful)

m00sh (2538182) | about 8 months ago | (#47110205)

The article seems top-heavy, meaning it puts all the emphasis on "machine learning in server farms" and far too little on what it actually produces: some fuzzy paragraph on cooling methods when some servers are taken offline.

You could use "machine learning for peace in the Middle East" or "machine learning for fixing the economy," but unless it produces real results, it's just an experiment.

Re:A little top heavy! (1)

Branciforte (2437662) | about 8 months ago | (#47116475)

I actually work with Jim Gao. His design doc was already open in another tab when I saw this article. Jim's a really smart guy. Really nice guy too.

I can't talk too much about it. You have a huge amount of electricity coming into the DC, on the order of a lightning bolt, and it has to be intelligently choreographed to make the best use of it. Then you have to carry away the heat. There is a lot of machinery to do that, and by accurately predicting where and when power is going to be needed, both for servers and for cooling, you can allocate your resources (chillers, transformers, fans, etc...) most efficiently. It's complex. It involves weather patterns and coolant flow rates and even minor things, like when the heaters turn on to keep the oil in the generators at the right viscosity in case of an emergency. You need to know all that stuff to decide where to best route (provision) the power. If you statically allocate power to cooling and other subservient systems, you lose the opportunity to use that power for more server machines. The more intelligently you can control things in real time, the more efficient you can be.

The coolest part is Jim had a clever idea, wrote it up, and it became reality based on its own merits. Jim is relatively new and junior at Google, but that didn't matter at all.
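The prediction-driven provisioning described above can be caricatured with a toy model. To be clear, this is a synthetic sketch, not Google's method (which per the comments is a neural network over many operating parameters): it fits a linear predictor of cooling power from two invented features, server load and outside temperature, by ordinary least squares on fabricated data.

```python
import random

random.seed(0)

# Fabricated training data: cooling power rises with server load (kW) and
# outside temperature (deg C).  The "true" relationship, hidden from the model:
#   cooling_kw = 0.3 * load + 2.0 * temp + noise
data = [(load, temp, 0.3 * load + 2.0 * temp + random.gauss(0, 1.0))
        for load in range(100, 1000, 50)
        for temp in range(5, 35, 5)]

# Ordinary least squares via the 2x2 normal equations (Cramer's rule).
sll = sum(l * l for l, t, y in data)
stt = sum(t * t for l, t, y in data)
slt = sum(l * t for l, t, y in data)
sly = sum(l * y for l, t, y in data)
sty = sum(t * y for l, t, y in data)
det = sll * stt - slt * slt
w_load = (sly * stt - sty * slt) / det
w_temp = (sll * sty - slt * sly) / det

# The fitted weights recover roughly 0.3 and 2.0 despite the noise,
# and could then be used to pre-position cooling capacity.
print(round(w_load, 2), round(w_temp, 2))
```

A real deployment would of course use far richer features and a nonlinear model; the point is only that a fitted predictor, not a static allocation, drives the provisioning decision.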

Machines making machines. How perverse! (1)

mmell (832646) | about 8 months ago | (#47110221)


Neural Network? (1)

Graydyn Young (2835695) | about 8 months ago | (#47110273)

I'm very curious as to why they are using a neural network for this. I'm no machine learning expert, but I was under the impression that neural networks were somewhat outdated. And yet it seems like Google is spending rather a lot of time with them lately.

Re:Neural Network? (3, Informative)

Anonymous Coward | about 8 months ago | (#47110363)

Not outdated - just that they work well only with certain use cases. I know that SMART whiteboards use neural networks to process camera input in identifying if a finger or a pen is being used to write something - and they are very good at that. No matter how you hold the pen they will recognise it, and never confuse a finger for a pen. They are fussy enough that I've been unable to duplicate a pen for myself with a 3D printer.

But they are no miracle solution. There are plenty of cases where you might think a neural network would work, yet it fails dismally. The research in recent years has been into the properties of much, much larger networks - something made possible by access to a lot more processing power. It hasn't been terribly promising. They are unpredictable.
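A classic illustration of the "certain use cases" point: XOR is not linearly separable, so no single neuron can compute it, but a two-neuron hidden layer can. The weights below are hand-wired rather than trained, purely to show why the hidden layer matters:

```python
def step(x: float) -> int:
    # Hard-threshold activation: fire iff the weighted sum is positive.
    return 1 if x > 0 else 0

def neuron(inputs, weights, bias):
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

def xor_net(a: int, b: int) -> int:
    h_or   = neuron((a, b), (1, 1), -0.5)    # fires if a OR b
    h_nand = neuron((a, b), (-1, -1), 1.5)   # fires unless a AND b
    # XOR = OR AND NAND
    return neuron((h_or, h_nand), (1, 1), -1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints the XOR truth table
```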

Modeling outpaces practical feedback (4, Informative)

ttsai (135075) | about 8 months ago | (#47110345)

Artificial intelligence and neural networks are a hot topic, so this is piggy-backing on that trend. It's not a surprise that Andrew Ng's work is referenced quite a bit.

While the modeling is interesting, it seems to be just modeling at this point. The main claim of the white paper is high PUE prediction accuracy by the model. While that's academically interesting, the real use is in feedback for optimization. The white paper author realized that and included that optimization problem as one of the examples in the paper. However, the optimization was achieved "through a combination of PUE simulations and local expertise." I'm guessing that the local expertise part was relatively significant, because there is basically no discussion of this even though it is the one application that would make this work practical and genuinely interesting. The paper claimed that this neural network-based optimization reduced PUE "by ~0.02 compared to the previous configuration." But I have no idea how that would have compared to optimization using just local expertise, without the benefit of neural network modeling.
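The optimization step ttsai is asking about can at least be sketched in the abstract: once you trust a model's PUE predictions, you can sweep a control variable and pick the setting the model scores best. The model function below is a hypothetical stand-in (an invented curve with a minimum in the feasible range), not anything from the white paper:

```python
# Hypothetical stand-in for a trained model: predicted PUE as a function of
# one control variable, a chilled-water setpoint in deg C.  A real model
# would take dozens of operating parameters; this toy curve simply has
# its minimum at 18 deg C.
def predicted_pue(setpoint_c: float) -> float:
    return 1.10 + 0.0004 * (setpoint_c - 18.0) ** 2

# Sweep the feasible setpoints and pick the one the model likes best.
candidates = [c / 2 for c in range(20, 51)]  # 10.0 .. 25.0 in 0.5 deg steps
best = min(candidates, key=predicted_pue)
print(best, round(predicted_pue(best), 4))  # 18.0 1.1
```

Whether this beats plain local expertise is exactly the open question ttsai raises; the sketch only shows where the model plugs into the loop.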

Re:Modeling outpaces practical feedback (1)

sckienle (588934) | about 8 months ago | (#47112739)

Hardly new, though. People have been trying to show that niche successes of AI and NNs can translate into major successes for years, at least since the 1980s.

Except for that one time .... (1)

tommeke100 (755660) | about 8 months ago | (#47110493)

It's probably fairly easy to predict usage. They've been doing it for ages with the electricity power grid.
But what will happen when a singularity arises?

Re:Except for that one time .... (1)

geekoid (135745) | about 8 months ago | (#47110633)

Nothing. The singularity requires resources, resources humans need to provide. So while you may have a system that designs smarter systems, assuming that's even possible, it's not like they will magically appear everywhere.

The singularity is a largely overblown issue that fits right into the same meme that infects humans about religion.

Re:Except for that one time .... (0)

Anonymous Coward | about 8 months ago | (#47110743)

Resource humans need to provide.

Until one mine too many blows up and people switch to machines for mining.

* Resource acquisition
* Transport
* Refinement
* Part-picking
* Assembly
* Programming

Automating these is necessary (but not sufficient) for the robotic overlord "singularity".

Re:Except for that one time .... (1)

geekoid (135745) | about 8 months ago | (#47110995)

hey look. power lines.
Hey look, a back hoe
Hey look, sparks.

GSA - Google Security Administration? (1)

DaWhilly (2555136) | about 8 months ago | (#47110597)

Google is only using metadata and not actual server data for their analysis to determine threats to server stability, right?

ob. Link (1)

geekoid (135745) | about 8 months ago | (#47110607)

Re:ob. Link (1)

Greyfox (87712) | about 8 months ago | (#47114861)

I kind of like the XKCD one [xkcd.com]

Yu0 Fail It (-1)

Anonymous Coward | about 8 months ago | (#47111725)

Re: Yu0 Fail It (-1)

Anonymous Coward | about 8 months ago | (#47113837)

Mod parent up

And bring back the brackets in mobile view.

Lol. (0)

Anonymous Coward | about 8 months ago | (#47112451)

Developing himself out of a job. Anyone remember the Twilight Zone episode with Mr. Whipple, whose machines replace everyone and finally him?

The Forbin project... (1)

the_rajah (749499) | about 8 months ago | (#47113753)

Colossus. Need I say more?