For most companies, the easiest ways to eliminate problems often go unnoticed. Amid the load of deploying emerging technologies and new IT infrastructure, older technologies are often not considered when pursuing a solution. Power consumption may fall into this category as a simple yet neglected area of improvement.
As a matter of fact, energy has become an industry-wide concern, and there are some very simple, easy-to-implement strategies that can lead to immediate savings for any company. None of these methods relies on newly introduced technology. Here are five simple strategies to reduce energy consumption in data centers.
1. Hot Aisle / Cold Aisle If your data center is not already configured in a hot aisle / cold aisle layout, it should be. This layout ensures a good flow of hot and cold air, resulting in less work for both the HVAC system and the computer equipment's own cooling. In almost every layout, server racks are arranged in rows in pairs, and data center computer equipment is built to draw cool air in through the front of the machine and expel heated air out the rear. Setting the racks so they face each other in pairs promotes the flow of cooler air between the fronts of two rows of racks and the flow of hot air between the backs of two rows, producing alternating "cold" aisles and "hot" aisles. Note: all data centers use similar principles for rack placement, but they differ in exactly where the computer room air conditioning (CRAC) units are located. To optimize the flow of hot air back into the HVAC return ducts, the rows of racks should be placed at 90-degree angles to the CRAC units. Positioned this way, the warmer air returns to the ducts unhindered. Make sure hot air is never forced to travel across cold aisles on its way back, as this heats the cold aisles. Also note that there is technology available to simulate the airflow in and out of a data center, which can be used before you actually install the equipment. The hot aisle / cold aisle layout uses the natural properties of cooler and warmer air to reduce the work required of mechanisms that consume power to get the job done. Wherever warm and cold air are allowed to mix, more work, and therefore more power, is required of the CRAC units and the computers' cooling mechanisms. Computer equipment that is not designed to draw cool air in at the front and exhaust it at the rear disrupts the airflow inherent in a hot aisle / cold aisle layout. Such non-compliant devices must be ducted so that hot exhaust air flows directly into a hot aisle, or placed in cabinets that redirect top or side exhaust to the back of the rack.
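The pairing rule above can be reduced to a quick sanity check: when two rack rows face each other, the aisle between their fronts is cold and the aisle between their backs is hot; rows that all face the same way feed one row's exhaust into the next row's intake. A minimal sketch of that check (the `aisle_types` helper and the row encoding are hypothetical, not taken from any real DCIM tool):

```python
def aisle_types(row_facings):
    """Classify the aisle between consecutive rack rows.

    Rows are assumed to run east-west and are listed south to north;
    'N' means a row's front (intake) faces north, 'S' faces south.
    """
    aisles = []
    for south_row, north_row in zip(row_facings, row_facings[1:]):
        if south_row == 'N' and north_row == 'S':
            aisles.append('cold')   # two fronts face each other: cold aisle
        elif south_row == 'S' and north_row == 'N':
            aisles.append('hot')    # two backs face each other: hot aisle
        else:
            aisles.append('mixed')  # same facing: exhaust feeds an intake
    return aisles

# A proper layout alternates cold and hot aisles:
print(aisle_types(['N', 'S', 'N', 'S']))  # ['cold', 'hot', 'cold']
# Rows all facing the same way mix hot and cold air:
print(aisle_types(['N', 'N', 'N']))       # ['mixed', 'mixed']
```

Any `'mixed'` result flags a pair of rows that should be turned to face each other before the equipment is installed.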
2. The Correct Temperature and Humidity Maintaining the right temperature and humidity inside the data center is essential for good airflow in the room. Too often there is a misconception that the server room must be cold, and little is done to monitor and maintain optimal temperature and humidity levels. Many times the only sensing device in the room is the thermostat. The server room should be kept cool, but it really does not need to be much colder than the average room temperature throughout the site. The recommended temperature range for a data center is between 67 and 72 degrees Fahrenheit. Too often, companies cool their data centers down to 65 degrees. It is okay to let the room run a few degrees warmer; the computer equipment will still operate within the recommended, optimal range.
Roughly a 4% reduction in data center energy consumption can be achieved for every degree the room is kept warmer. Of course, keep the equipment within the acceptable temperature range and never above 75 degrees, even if the hardware specifications rate the equipment to operate at temperatures up to 95 degrees. Machines running outside the acceptable range consume power faster and carry a risk of downtime due to overheating. A further problem with cooling a server room more than the rooms around it is higher humidity.
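The 4%-per-degree figure above can be turned into a rough estimate of what raising the setpoint is worth. A minimal sketch (the function name is made up for illustration, and it assumes the savings compound with each degree raised):

```python
def cooling_savings(current_f, target_f, pct_per_degree=0.04):
    """Rough fractional energy savings from raising the cooling
    setpoint from current_f to target_f (degrees Fahrenheit),
    assuming savings compound per degree raised."""
    degrees_raised = target_f - current_f
    if degrees_raised <= 0:
        return 0.0
    # Each degree raised multiplies consumption by (1 - pct_per_degree).
    return 1.0 - (1.0 - pct_per_degree) ** degrees_raised

# Raising an over-cooled 65 F room to the recommended 72 F:
print(f"{cooling_savings(65, 72):.1%}")  # 24.9%
```

Even with this crude model, bringing a 65-degree room up to the top of the recommended band cuts roughly a quarter of the cooling-related energy.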
The recommended data center humidity is between 45% and 55%. Cooling the air too much can raise the relative humidity above acceptable levels. Once this happens, condensation can collect inside sensitive equipment and result in hardware failure. Without the right sensors in place to detect temperature and humidity in the data center, neither property of the air can be controlled. Sensors should be placed throughout the room to ensure that all equipment stays within acceptable temperature and humidity ranges. Know, too, that there is a significant difference between conventional air conditioners and CRAC units: expect energy savings and longer equipment life with CRAC units installed.
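The sensor advice above amounts to checking every reading against the recommended bands (67-72 degrees Fahrenheit, 45-55% relative humidity). A minimal sketch of such a check, with made-up sensor names and readings:

```python
RECOMMENDED_TEMP_F = (67, 72)   # recommended temperature band, Fahrenheit
RECOMMENDED_RH = (45, 55)       # recommended relative-humidity band, percent

def out_of_range(readings):
    """Return sensors whose temperature or humidity falls outside the
    recommended bands. `readings` maps sensor name -> (temp_f, rh)."""
    problems = {}
    for sensor, (temp_f, rh) in readings.items():
        issues = []
        if not RECOMMENDED_TEMP_F[0] <= temp_f <= RECOMMENDED_TEMP_F[1]:
            issues.append(f"temperature {temp_f} F")
        if not RECOMMENDED_RH[0] <= rh <= RECOMMENDED_RH[1]:
            issues.append(f"humidity {rh}%")
        if issues:
            problems[sensor] = issues
    return problems

# Hypothetical readings from sensors spread around the room:
readings = {"cold-aisle-1": (68, 50), "cold-aisle-2": (65, 58)}
print(out_of_range(readings))
# {'cold-aisle-2': ['temperature 65 F', 'humidity 58%']}
```

An over-cooled sensor location showing high humidity, as in the second reading, is exactly the condensation risk described above.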
3. Cold Aisle Floor Tiles A good layout should include perforated tiles or grates in the floor to facilitate the flow of cooler air from the subfloor to the server intakes. Placing the perforations in the cold aisles increases the cool air rising into those aisles, using the intrinsic properties of cooler air to reduce the workload of the HVAC system and the computers' cooling mechanisms. Hot aisles should not include perforated flooring, so that the flow of hot air to the return air ducts is preserved. Any time equipment is moved in the data center and the cold aisles change, the perforated tiles should also be moved to maintain the free flow of cooling air.
4. Bypass Air Bypass air is conditioned air in the data center that never reaches the air intakes of the computer equipment. This inefficiency results from flaws in the data center's airflow layout. Cooler air should rise from the floor only through the perforated tiles or grates in the cold aisles; air escaping through the floor anywhere else should be prevented. Bypass air is often the result of anomalies in the data center floor, such as holes cut for power or network cables leaving the room, or broken tiles. Look for these areas in the server room floor and, where possible, seal them to keep bypass air to a minimum. Any time equipment is moved in the room, check for newly exposed floor defects and repair them. Also look for bypass air escaping through cable openings in the back of server cabinets; wherever wiring leaves the back of a server, the opening should be properly sealed. Another area where bypass air can occur is around the data center door, so ensure that this entrance is tightly sealed.
5. Blanking Panels Finally, in a proper hot / cold aisle configuration, blanking panels must be placed in the server racks wherever there are no machines. If there are gaps in the server racks, hot exhaust air will recirculate through them back into the cold aisle, reducing the efficiency of the entire configuration. These panels simply cover the openings on the front of the racks to block the mixing of hot and cold air. This simple solution maximizes the power savings of your hot / cold aisle layout.
Five key questions to ask yourself:
1. Do you view reducing data center energy consumption as a way to cut business costs? Too often, a data center has grown over the years, but its energy use has never been revisited. Energy consumption in the data center should be optimized.
2. Is the data center arranged in a proper hot aisle / cold aisle configuration? In many facilities, computer equipment is lined up with the front of one rack facing the back of another. This is an inefficient arrangement that increases the workload of both the HVAC system and the cooling mechanisms inside the computers.
3. Is the temperature in the data center at the bottom end of the acceptable range for data center equipment? A common misconception is that the server room must be cold. The truth is that data centers can run a bit warmer, and slightly humid, but not too humid.
4. Is bypass air in the room undermining the natural airflow of the hot aisle / cold aisle configuration? Any holes around electrical lines or cables, including those in the racks themselves, must be sealed so that the air follows its intended paths.
5. Are sufficient blanking panels in place to keep warmer air from mixing with colder air? Blanking panels are simple, inexpensive devices that prevent the mixing of hot and cold air. Implementing these simple defenses will result in an immediate reduction in data center energy costs. The savings can be significant for a company, and these solutions are feasible at very low cost: the only equipment to purchase is air monitoring sensors and blanking panels, both of which are very cheap.
Energy costs are expected to continue to grow in the near future. Electrical grids are operating at maximum capacity, and environmental concerns are preventing the construction of new power plants. Achieving maximum energy efficiency in your business is essential to avoid unnecessary costs tied to data center energy consumption. Optimization is the key in your data center, so do not lose sight of the obvious.