One of the essential tools for achieving this goal is power supply derating, sometimes described as single-rail reduced power output. In this article, we will delve into derating factors, aiming to clarify their role in power supply selection and data center cabling design.
First, let us define derating. In electrical engineering, the term refers to the practice of operating a device below its maximum rated continuous output in order to prevent overheating and other performance issues. Server PSU manufacturers often incorporate derating into the design as a reliability feature, allowing devices to operate within safe temperature ranges, guaranteeing uptime, and preventing potential infrastructure failures.
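As a minimal sketch of the arithmetic involved, the following function applies a derating factor to a PSU's nameplate rating to find its safe continuous output. The function name and the 80% example value are illustrative assumptions, not vendor specifications:

```python
def derated_output_watts(rated_watts: float, derating_factor: float) -> float:
    """Safe continuous output after applying a derating factor.

    derating_factor is the fraction of rated power the PSU may deliver
    continuously (e.g. 0.8 means at most 80% of the nameplate rating).
    """
    if not 0.0 < derating_factor <= 1.0:
        raise ValueError("derating factor must be in (0, 1]")
    return rated_watts * derating_factor

# Example: an 800 W PSU derated to 80% leaves 640 W of usable capacity.
print(derated_output_watts(800, 0.8))  # 640.0
```

In capacity planning, this derated figure, not the nameplate rating, is what should be compared against the expected rack load.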
The derating factor is the ratio of the actual allowable output capacity to the maximum rated power, usually expressed as a percentage for easy comparison. Derating can be categorized into three types:
- Input line voltage variation derating
- Standard output derating
- Optional internal derating curves
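The internal derating curves mentioned above are often piecewise linear: full rated output up to a threshold ambient temperature, then a linear reduction toward a maximum operating temperature. The sketch below illustrates that shape; the 40 °C and 70 °C thresholds are assumed values for illustration only, and real curves vary by vendor and model:

```python
def derating_curve(rated_watts: float, ambient_c: float,
                   full_power_max_c: float = 40.0,
                   zero_power_c: float = 70.0) -> float:
    """Allowed continuous output (W) at a given ambient temperature (C).

    Full output up to full_power_max_c, zero at zero_power_c, and a
    linear ramp between the two thresholds (assumed values).
    """
    if ambient_c <= full_power_max_c:
        return rated_watts
    if ambient_c >= zero_power_c:
        return 0.0
    # Linear interpolation between the two temperature thresholds.
    span = zero_power_c - full_power_max_c
    return rated_watts * (zero_power_c - ambient_c) / span

print(derating_curve(800, 25))  # 800.0 (full output below threshold)
print(derating_curve(800, 55))  # 400.0 (halfway down the ramp)
```

Consulting the manufacturer's published curve for the specific PSU model is essential before relying on any such approximation.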