Trends in Data Center Design - ASHRAE Leads the Way to Large Energy Savings

Transcription

Trends in Data Center Design - ASHRAE Leads the Way to Large Energy Savings
ASHRAE Conference, Denver
Otto Van Geet, PE
June 24, 2014
NREL/PR-6A40-58902
NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.

Data Center Energy
Data centers are energy-intensive facilities:
- 10-100x more energy intensive than an office
- Server racks well in excess of 30 kW
- Surging demand for data storage
- EPA estimate: 3% of U.S. electricity
- Power and cooling constraints in existing facilities
Data center inefficiency steals power that would otherwise support compute capability.
Photo by Steve Hammond, NREL

BPG Table of Contents
- Summary
- Background Information
- Information Technology Systems
- Environmental Conditions
- Air Management
- Cooling Systems
- Electrical Systems
- Other Opportunities for Energy Efficient Design
- Data Center Metrics & Benchmarking

Safe Temperature Limits
- CPUs: 65°C (149°F)
- GPUs and memory: 75-85°C (167-185°F)
CPU, GPU, and memory represent 75-90% of the heat load.
Photo by Steve Hammond, NREL

Environmental Conditions
Data center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines.
Environmental specifications at the equipment intake (°F):
- Temperature: Recommended 65-80°F; Allowable 59-90°F (A1) up to 41-113°F (A4)
- Humidity (RH): Recommended 42°F DP to 60% RH or 59°F DP; Allowable 20%-80% RH & 63°F DP
ASHRAE Reference: ASHRAE (2008), (2011)
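A minimal sketch (not from the presentation) of how these envelopes might be checked in monitoring code; the function name, the simplified humidity screen, and the restriction to the A1 allowable class are assumptions made only for illustration:

```python
# Sketch: classify a rack-intake reading against the dry-bulb envelopes quoted above.
# Humidity checks are simplified to the recommended dew-point/RH bounds; units are deg F.

RECOMMENDED_DB = (65.0, 80.0)   # recommended dry-bulb range, deg F
ALLOWABLE_DB_A1 = (59.0, 90.0)  # class A1 allowable dry-bulb range, deg F

def classify_intake(dry_bulb_f, dew_point_f=None, rh_percent=None):
    """Return 'recommended', 'allowable (A1)', or 'out of range' for an intake reading."""
    lo_r, hi_r = RECOMMENDED_DB
    lo_a, hi_a = ALLOWABLE_DB_A1

    in_recommended = lo_r <= dry_bulb_f <= hi_r
    # Simplified humidity screen: 42F dew point up to 60% RH / 59F dew point.
    if in_recommended and dew_point_f is not None:
        in_recommended = 42.0 <= dew_point_f <= 59.0
    if in_recommended and rh_percent is not None:
        in_recommended = rh_percent <= 60.0

    if in_recommended:
        return "recommended"
    if lo_a <= dry_bulb_f <= hi_a:
        return "allowable (A1)"
    return "out of range"

print(classify_intake(72.0, dew_point_f=50.0, rh_percent=45.0))  # -> recommended
print(classify_intake(86.0))                                     # -> allowable (A1)
```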

Equipment Environmental Specification
Air inlet to IT equipment is the important specification to meet. Outlet temperature is not important to IT equipment.

Key Nomenclature
- Recommended Range (Statement of Reliability): preferred facility operation; most values should be within this range.
- Allowable Range (Statement of Functionality): robustness of equipment; no values should be outside this range.
[Diagram: rack intake temperature scale showing min/max recommended and min/max allowable limits, with over-temp and under-temp regions outside the allowable range.]

2011 ASHRAE Thermal Guidelines
2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance. White paper prepared by ASHRAE Technical Committee TC 9.9.

2011 ASHRAE Allowable Ranges
[Chart: allowable dry-bulb temperature ranges for the 2011 ASHRAE data center classes.]

Psychrometric Bin Analysis
Boulder, Colorado TMY3 weather data plotted on a psychrometric chart (dry-bulb temperature vs. humidity ratio), with the Class 1 recommended and allowable ranges overlaid.
Design conditions (0.4%): 91.2°F db, 60.6°F wb.

Estimated Savings
Baseline: DX cooling with no economizer; load of 1 ton of cooling, constant year-round; efficiency (COP) of 3; total energy of 10,270 kWh/yr.
The bin analysis assigns each hour of the year to a cooling zone:
- Zone 1: DX cooling only
- Zone 2: Multistage indirect evap. + DX (H80)
- Zone 3: Multistage indirect evap. only
- Zone 4: Evaporative cooler only
- Zone 5: Evaporative cooler + economizer
- Zone 6: Economizer only
- Zone 7: 100% outside air
[Table: hours and energy (kWh) by zone for both the recommended and allowable ranges, with estimated % savings.]
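The zone counts above come from binning the TMY3 hours against the equipment envelope. The sketch below shows the general shape of such a binning; the zone thresholds, the 5°F evaporative approach, and the function names are illustrative assumptions, not the boundaries used in the NREL analysis:

```python
# Illustrative only: count weather hours falling into simplified cooling "zones".
# hourly_weather is assumed to be a list of (dry_bulb_F, wet_bulb_F) pairs from TMY3 data.
from collections import Counter

def cooling_zone(dry_bulb_f, wet_bulb_f, supply_max_f=80.0, approach_f=5.0):
    evap_supply = wet_bulb_f + approach_f          # achievable direct-evap supply temp
    if dry_bulb_f <= supply_max_f - 15.0:
        return "economizer only"
    if dry_bulb_f <= supply_max_f:
        return "economizer + evaporative"
    if evap_supply <= supply_max_f:
        return "evaporative only"
    return "mechanical (DX) assist"

def bin_hours(hourly_weather):
    return Counter(cooling_zone(db, wb) for db, wb in hourly_weather)

# Tiny synthetic data set standing in for 8,760 TMY3 hours:
sample = [(55.0, 45.0), (75.0, 55.0), (92.0, 60.0), (100.0, 70.0)]
print(bin_hours(sample))
```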

Energy Savings Potential: Economizer Cooling
Energy savings potential for the recommended envelope, Stage 1: Economizer cooling. (Source: Billy Roberts, NREL)

Energy Savings Potential: Economizer + Direct Evaporative Cooling
Energy savings potential for the recommended envelope, Stage 2: Economizer + direct evap. cooling. (Source: Billy Roberts, NREL)

Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling
Energy savings potential for the recommended envelope, Stage 3: Economizer + direct evap. + multistage indirect evap. cooling. (Source: Billy Roberts, NREL)

Improve Air Management
- Typically, more air is circulated than required
- Air mixing and short circuiting leads to:
  - Low supply temperature
  - Low delta T
- Use hot and cold aisles
- Improve isolation of hot and cold aisles to:
  - Reduce fan energy
  - Improve air-conditioning efficiency
  - Increase cooling capacity
Hot aisle/cold aisle configuration decreases mixing of intake and exhaust air, promoting efficiency.
Source: bestpractices.pdf

Isolate Cold and Hot Aisles
With isolation, return air runs at 95-105°F instead of 60-70°F, and supply air at 70-80°F instead of 45-55°F.
Source: bestpractices.pdf

Adding Air Curtains for Hot/Cold Isolation
Photo used with permission from the National Snow and Ice Data Center. http://www.nrel.gov/docs/fy12osti/53939.pdf

Move to Liquid Cooling
- Server fans are inefficient and noisy
  - Liquid doors are an improvement, but we can do better
- Power densities are rising, making component-level liquid cooling solutions more appropriate
- Liquid benefits:
  - Thermal stability, reduced component failures
  - Better waste heat re-use options
  - Warm water cooling; reduce/eliminate condensation
  - Provide cooling with higher temperature coolant
- Eliminate expensive and inefficient chillers
- Save wasted fan energy and use it for computing
- Unlock your cores and overclock to increase throughput

Liquid Cooling - Overview
Water and other liquids (dielectrics, glycols, and refrigerants) may be used for heat removal.
- Liquids typically use less transport energy (14.36 air-to-water horsepower ratio for the example below).
- Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased economizer hours.
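To make the transport-energy point concrete, here is a rough order-of-magnitude sketch comparing ideal fan and pump power for the same heat load; the rack size, pressure drops, and efficiency are assumed values, so the resulting ratio will not match the 14.36 figure from the slide's specific example:

```python
# Rough sketch with assumed pressure drops and efficiency (not the slide's example):
# compare ideal transport power for removing the same heat with air vs. water at the same delta-T.

def transport_power_w(heat_w, delta_t_k, rho, cp, pressure_drop_pa, efficiency=0.6):
    """Ideal fan/pump power = volumetric flow * pressure rise / efficiency."""
    vol_flow_m3s = heat_w / (rho * cp * delta_t_k)
    return vol_flow_m3s * pressure_drop_pa / efficiency

HEAT_W = 30_000.0   # one 30 kW rack
DELTA_T = 10.0      # K temperature rise of the coolant

air_power = transport_power_w(HEAT_W, DELTA_T, rho=1.2, cp=1006.0,
                              pressure_drop_pa=500.0)        # assumed duct/coil pressure drop
water_power = transport_power_w(HEAT_W, DELTA_T, rho=998.0, cp=4186.0,
                                pressure_drop_pa=150_000.0)  # assumed piping loop pressure drop

print(f"air fan power   ~ {air_power:,.0f} W")
print(f"water pump power ~ {water_power:,.0f} W")
print(f"air/water transport power ratio ~ {air_power / water_power:.1f}")
```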

Liquid Cooling - New Considerations
- Air cooling: humidity, fan failures, air-side economizers, particulates
- Liquid cooling: pH and bacteria, dissolved solids, corrosion inhibitors, etc.
When considering liquid-cooled systems, insist that providers adhere to the latest ASHRAE water quality spec or it could be costly.

2011 ASHRAE Liquid Cooling Guidelines

2011 ASHRAE Liquid Cooling Guidelines
NREL ESIF HPC (HP hardware) using 75°F supply, 113°F return - W4/W5

Courtesy of Henry Coles, Lawrence Berkeley National Laboratory

Three Cooling Device Categories
- 1 - Rack cooler (water)
- 2 - Row cooler (APC - water; Liebert - refrigerant)
- 3 - Passive door cooler (water)
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory

"Chill-off 2" Evaluation of Close-coupled Cooling Solutions
[Chart: measured energy use of the evaluated close-coupled cooling devices; lower is less energy use.]
Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory

Cooling Takeaways
- Use a central plant (e.g., chiller/CRAHs vs. CRAC units)
- Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying
- Move to liquid cooling (room, row, rack, chip)
- Consider VSDs on fans, pumps, chillers, and towers
- Use air- or water-side economizers
- Expand humidity range and improve humidity control (or disconnect it)

Data Center Efficiency Metric
- Power Usage Effectiveness (PUE) is an industry-standard data center efficiency metric.
- It compares the power used or lost by data center facility infrastructure (pumps, lights, fans, conversions, UPS, etc.) to the power used by compute.
- Not perfect; some folks play games with it.
- A 2011 survey estimates the industry average is 1.8: in a typical data center, half of the power goes to things other than compute capability.

PUE = ("IT power" + "Facility power") / "IT power"
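A minimal sketch of the PUE arithmetic; the power figures below are invented for illustration:

```python
# Sketch of the PUE ratio described above: total facility power (IT plus
# infrastructure overhead) divided by IT power.

def pue(it_power_kw, facility_overhead_kw):
    """PUE = (IT power + facility infrastructure power) / IT power."""
    return (it_power_kw + facility_overhead_kw) / it_power_kw

# A "typical" data center where a large share of the power feeds non-compute loads:
print(pue(it_power_kw=1000.0, facility_overhead_kw=800.0))  # -> 1.8, the 2011 survey average
# A highly efficient facility like the NREL HPC data center target:
print(pue(it_power_kw=1000.0, facility_overhead_kw=60.0))   # -> 1.06
```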

PUE - Simple and Effective

Data Center PUE
[Chart: data center PUE and outdoor temperature plotted over time; PUE axis from 0.75 to 1.45, outdoor temperature axis from -20 to 100°F. NREL PIX 17828]

"I am re-using waste heat from my data center on another part of my site and my PUE is [...]"
Reference: The Green Grid ERE white paper.

ASHRAE and friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always >= 1.0.
Another metric has been developed by The Green Grid: ERE, Energy Reuse Effectiveness (see The Green Grid ERE white paper).

ERE - Adds Energy Reuse
ERE accounts for energy exported from the data center for reuse:
ERE = (Cooling + Power + Lighting + IT - Reused Energy) / IT Energy
[Diagram: utility feed through UPS and PDU to the IT load, with cooling, power, and lighting overheads and a reused-energy stream leaving the data center boundary.]
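A hedged sketch of the ERE calculation as defined by The Green Grid; the kWh figures are invented, scaled only so that the resulting PUE and ERE match the 1.18 / 0.9 values reported for the RSF data center later in the deck:

```python
# Sketch of ERE: energy exported for reuse is subtracted from the facility total,
# so ERE can drop below 1.0 even though PUE cannot. Example figures are illustrative,
# not measured NREL data.

def ere(it_energy_kwh, infrastructure_energy_kwh, reused_energy_kwh):
    """ERE = (total facility energy - reused energy) / IT energy."""
    total = it_energy_kwh + infrastructure_energy_kwh
    return (total - reused_energy_kwh) / it_energy_kwh

it = 1_000_000.0      # kWh of IT energy over a year (assumed)
infra = 180_000.0     # kWh of cooling/power/lighting overhead -> PUE = 1.18
reused = 280_000.0    # kWh of waste heat exported to heat the building (assumed)

print(f"PUE = {(it + infra) / it:.2f}")        # 1.18
print(f"ERE = {ere(it, infra, reused):.2f}")   # 0.90, reuse pushes it below 1.0
```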

DOE/NREL Research Support Facility
- More than 1,300 people in DOE office space on NREL's campus
- 360,000 ft²
- Design/build process with required energy goals:
  - 25 kBtu/ft²
  - 50% energy savings
  - LEED Platinum
- Replicable: process, technologies, cost
- Site, source, carbon, cost ZEB (includes plug loads and data center)
- Firm fixed price: $246/ft² construction cost (not including $27/ft² for PV from PPA/ARRA)
- Opened June 10, 2010 (first phase)
Credit: Haselden Construction

RSF Datacenter
- Fully contained hot aisle
  - Custom aisle floor and door seals
  - Ensure equipment is designed for cold-aisle containment and installed to pull cold air, not hot air
- 1.18 annual PUE; ERE 0.9
- Control hot aisle based on return temperature of 90°F
- Waste heat used to heat the building
- Economizer and evaporative cooling
- Low fan energy design
- 1,900 ft²
NREL PIX 25417
Credit: Marjorie Schott/NREL

Leveraged expertise in energy-efficient buildings to focus on a showcase data center.
NREL HPC Data Center
Showcase facility:
- 10 MW, 10,000 ft²
- Leverage favorable climate
- Use evaporative rather than mechanical cooling
- Waste heat captured and used to heat labs and offices
- World's most energy efficient data center, PUE 1.06
- Lower CapEx and OpEx
- Chips-to-bricks approach
High Performance Computing:
- Petascale HPC capability in 2013
- 20-year planning horizon (5 to 6 HPC generations)
Insight Center:
- Scientific data visualization
- Collaboration and interaction

Critical Data Center Specs
- Warm water cooling, 75°F (24°C)
  - Water is a much better working fluid than air: pumps trump fans
  - Utilize high-quality waste heat, 95°F (35°C) or warmer
  - 90% of IT heat load to liquid
- High power distribution
  - 480 VAC; eliminate conversions
- Think outside the box
  - Don't be satisfied with an energy efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings
  - Innovate, integrate, optimize
Dashboards report instantaneous, seasonal, and cumulative PUE values.
Photo by Steve Hammond, NREL

Water Considerations
"We shouldn't use evaporative cooling, water is scarce."
- Thermoelectric power generation (coal, oil, natural gas, and nuclear) consumes about 1.1 gallons per kWh, on average.
- This amounts to about 9.5M gallons per MW-year.
- We estimate about 2.0M gallons of water consumed per MW-year for on-site evaporative cooling at NREL.
- If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375M gallons per year per MW.
- Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use.
A worked version of this arithmetic is sketched below.
NREL PIX 17828
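A back-of-the-envelope version of the water arithmetic, with 8,760 hours per year assumed; the chiller overhead fraction is treated as an assumption, since the transcribed 0.2 MW/MW does not quite reproduce the 2.375M gallon figure while roughly 0.25 MW/MW does:

```python
# Rough check of the embedded-water figures quoted above.

HOURS_PER_YEAR = 8760
GAL_PER_KWH_THERMOELECTRIC = 1.1   # gallons consumed per kWh generated (slide figure)

# Water embedded in 1 MW-year of purchased electricity:
gal_per_mw_year = GAL_PER_KWH_THERMOELECTRIC * 1000 * HOURS_PER_YEAR
print(f"{gal_per_mw_year / 1e6:.1f}M gallons per MW-year")   # ~9.6M (slide rounds to ~9.5M)

# On-site evaporative cooling at NREL (slide estimate): ~2.0M gallons per MW-year.
evap_gal = 2.0e6

# Chiller plant alternative: the extra chiller electricity carries embedded water.
chiller_overhead_mw_per_mw = 0.25   # assumption; roughly reproduces the 2.375M gallon figure
chiller_gal = chiller_overhead_mw_per_mw * gal_per_mw_year
print(f"chiller water impact ~ {chiller_gal / 1e6:.2f}M gallons per MW-year")
```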

Data Center Energy Efficiency
- ASHRAE 90.1-2011 requires economizers in most data centers.
- ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings
  - PURPOSE: To establish the minimum energy efficiency requirements of data centers and telecommunications buildings for design, construction, and a plan for operation and maintenance.
  - SCOPE: This standard applies to new data centers and telecommunications buildings, new additions, and modifications to existing ones (or portions thereof) and their systems.
  - Will set minimum PUE based on climate.
  - More details are available in the standard's published title, purpose, and scope.

Energy Conservation Measures
1. Reduce the IT load: virtualization and consolidation (up to 80% reduction).
2. Implement contained hot aisle and cold aisle layout: curtains, equipment configuration, blank panels, cable entrance/exit ports, etc.
3. Install economizer (air or water) and evaporative cooling (direct or indirect).
4. Raise discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5. Reuse data center waste heat if possible.
6. Raise the chilled water (if used) set point. Increasing chilled water temperature by 1°F reduces chiller energy use by 1.4%.
7. Install high efficiency equipment, including UPS, power supplies, etc.
8. Move chilled water as close to the server as possible (direct liquid cooling).
9. Consider a centralized, high-efficiency, water-cooled chiller plant: air-cooled is about 2.9 COP, water-cooled about 7.8 COP.
A rough sketch of the chiller rules of thumb in items 6 and 9 follows this list.
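The sketch below works through those two rules of thumb; the plant size, hours, and reset amount are invented for illustration, and the 1.4%-per-degree saving is compounded per degree here, which is one possible reading of the rule:

```python
# Sketch of two rules of thumb from the list above: the ~1.4% chiller energy saving
# per 1 F of chilled-water reset (item 6), and the air-cooled vs. water-cooled COP
# comparison (item 9). All plant sizes below are assumed values.

def chiller_energy_kwh(cooling_load_kw, hours, cop):
    """Electric energy needed to deliver a constant cooling load at a given COP."""
    return cooling_load_kw * hours / cop

LOAD_KW = 500.0   # assumed constant cooling load
HOURS = 8760      # full-year operation

air_cooled = chiller_energy_kwh(LOAD_KW, HOURS, cop=2.9)
water_cooled = chiller_energy_kwh(LOAD_KW, HOURS, cop=7.8)
print(f"air-cooled:   {air_cooled:,.0f} kWh/yr")
print(f"water-cooled: {water_cooled:,.0f} kWh/yr")

# Chilled-water reset: ~1.4% less chiller energy per degree F of setpoint increase.
reset_deg_f = 5
savings = water_cooled * (1 - (1 - 0.014) ** reset_deg_f)
print(f"{reset_deg_f} F reset saves ~{savings:,.0f} kWh/yr")
```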

QUESTIONS?
RSF II: 21 kBtu/ft², $246/ft² construction cost
Otto Van Geet
303.384.7369
Otto.VanGeet@nrel.gov
