
Understanding Power Supply Efficiency

Author: Reed · 25-05-15 21:50

When choosing a server power supply unit (PSU) for a data center or a server rack, one crucial consideration is ensuring the power supply can handle the maximum power demand during peak usage hours.
An essential tool for achieving this is the concept of the PSU derating factor, sometimes called output restriction. In this article, we will delve into derating factors, aiming to clarify their role in power supply selection and data center design.

First, let us examine the terminology. In electrical engineering, derating refers to the practice of operating a device below its rated limits in order to prevent overheating and other failure modes. Server PSU manufacturers often incorporate derating into the design as a reliability feature, allowing the devices to operate within safe temperature ranges, preserving uptime and preventing potential system failure.

The derating factor is the ratio of the actual available output capacity to the maximum rated capacity, usually expressed as a percentage or a decimal fraction. Derating can be categorized into three types:

  1. Input line voltage variation derating
  2. Standard output (temperature) derating
  3. Optional internal derating curves
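When more than one of these derating factors applies at once, a common engineering convention (an assumption here, not something the article states) is to treat independent factors as multiplicative. A minimal sketch:

```python
def combined_derating(*factors):
    """Combine independent derating factors multiplicatively.

    Each factor is the fraction of rated capacity still available
    (0 < f <= 1). The overall available fraction is their product.
    """
    result = 1.0
    for f in factors:
        if not 0.0 < f <= 1.0:
            raise ValueError(f"derating factor out of range: {f}")
        result *= f
    return result

# Hypothetical values: 95% input-voltage derating, 89% temperature derating.
overall = combined_derating(0.95, 0.89)  # ~0.8455 of rated capacity remains
```

The multiplicative rule is conservative only if the factors really are independent; a PSU's published derating curve should take precedence when one is available.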

A common derating factor found in server PSU specifications is the "design temperature limit" value, which tells you how much of the PSU's capacity is available at the design temperature. Suppose the PSU's rated current is 24 amps and the derating factor at 273 kelvin (0 degrees Celsius) is approximately 89%: at that temperature the PSU has a maximum output of 24 × 0.89 = 21.36 amperes, which multiplied by the PSU's rated voltage gives the available power. The available capacity never exceeds the PSU's maximum rated capacity at any temperature.
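The arithmetic in the example above can be sketched as follows; the rated current and derating factor come from the text, while the 12 V rail voltage is a hypothetical value for illustration:

```python
rated_current_a = 24.0   # PSU rated current from the example
derating_factor = 0.89   # fraction of capacity available at 0 °C (273 K)

# Derated (available) output current: 24 A x 0.89 ≈ 21.36 A
derated_current_a = rated_current_a * derating_factor

# Available power on a hypothetical 12 V rail: ~256 W
rail_voltage_v = 12.0
derated_power_w = derated_current_a * rail_voltage_v
```

Sizing against `derated_current_a` rather than the nameplate rating keeps the load within what the PSU can actually deliver at the design temperature.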


Copyright 2019 © HTTP://ety.kr