
DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
reply to techguy2012

Re: New Data Center Build out..

said by techguy2012:

Your windows are going to be competing with the air conditioning in that space too... unless... are you going to be putting solar panels in all the windows - I'd consider that exciting. Or what about humongous LED signage - also cool.

I'll just keep guessing amongst myself, until you tell us

You've made me think of a cool but expensive idea.

Replace those lights with LED panels. I know Leo Laporte uses LED panels for the TWiT studio, and they mentioned how nice they are because they run a lot cooler yet are just as bright as normal studio lights.

I mention this because, with those windows already letting heat into the room, any heat generated without need (i.e., heat from anything other than the power, computer, and cooling loads) should be reduced as much as possible.

ke4pym
Premium
join:2004-07-24
Charlotte, NC
You don't need whole panels. I was in a Facebook data center and their lighting is all LED, but the fittings they were in looked more like traditional lighting fixtures.

As you approached an area, the lights would come up in front of you and dim behind you.

Studios have been using fluorescent lighting for darn near a decade now. Those things don't put out a whole lot of heat.


DarkLogix
Texan and Proud
Premium
join:2008-10-23
Baytown, TX
kudos:3
Well, I've heard that people get hot under TV and movie studio lights.

Either way, the point I was making was to lower any and all heat not coming from the servers.

wolfman2g1

join:2008-04-28
Brooklyn, NY
The problem here is: 1. this is our first data center, and no one on the net ops team has ever built one from the ground up; 2. we have a strict deadline, so we can't change anything now.

We are learning a lot as we go, and we are actually planning on creating a spec for all future data centers. We've been taking cues from the big guys like Google and Facebook, and we are trying to incorporate as many of their conventions as we can at this stage in the game, like running the cooling at higher temps (75-85 degrees F) and labeling every cable.

One of the biggest issues we came across is cable management. I personally suck at it, so I don't have any real insight here; most of what I've been doing I've actually learned from the bbphotos and from this forum. What I've never really seen is how people deal with different types of cables. We have conventional Cat 6, fiber bundles, jumpers, and SFP+ cables, and they all have their own intricacies, so I haven't figured out a good way to manage them, especially since some servers have both SFP+ and Cat 6 and some switches have all three.
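For the "label every cable" part, one approach is a structured label that encodes the media type and both endpoints, so either end of a run tells you where the other end lands. The naming convention below is purely a hypothetical sketch for illustration, not anything the big operators publish or that this build uses:

```python
# Hypothetical cable-label generator. Encodes media type plus both
# endpoints as rack/unit/port, e.g. "SP-A01U12P3-B04U02P1".
# The convention itself is an illustrative assumption, not a standard.

CABLE_TYPES = {"CAT6": "C6", "FIBER": "FB", "SFP+": "SP"}

def make_label(cable_type, src_rack, src_unit, src_port,
               dst_rack, dst_unit, dst_port):
    """Return a label like 'C6-A01U12P3-B04U02P1'."""
    code = CABLE_TYPES[cable_type]           # short media-type prefix
    src = f"{src_rack}U{src_unit:02d}P{src_port}"  # source endpoint
    dst = f"{dst_rack}U{dst_unit:02d}P{dst_port}"  # destination endpoint
    return f"{code}-{src}-{dst}"

print(make_label("SFP+", "A01", 12, 3, "B04", 2, 1))
# SP-A01U12P3-B04U02P1
```

Keeping the media type in the label also helps when the same bundle mixes Cat 6, fiber, and SFP+ runs, since the prefix tells you how gently the cable needs to be handled before you even trace it.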

We've also investigated things such as raised floors, though I think that one is still up in the air, and possibly ducting the A/C directly into the racks, but that seems pretty expensive.

It's my personal goal, and my boss's, to be able to start developing LEED-certified installations within five years.

tomdlgns
Premium
join:2003-03-21
Chicago, IL
kudos:1
bbphotos? Another forum, or what?