Is PoE more efficient than plugging in adapters for each network device?

And at what scale does it start to matter?

For context: I’m going for a 3-node mesh router plus 2 switches, and was wondering whether, over 5 years, the difference in electricity cost would come out to less than the extra upfront cost of PoE gear. The absolute max cable length would probably be around 30m.
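A rough back-of-envelope sketch of that break-even question. All the numbers below are assumptions I made up for illustration (per-adapter idle loss, PoE conversion overhead, electricity price), not measurements; plug in your own:

```python
# Rough break-even sketch. Every number here is an assumption:
# tweak them for your actual devices and local electricity price.
adapters = 5                  # 3 mesh nodes + 2 switches
waste_per_adapter_w = 1.5     # assumed idle/conversion loss per wall adapter (W)
poe_overhead_w = 1.0          # assumed extra conversion loss per PoE device (W)
price_per_kwh = 0.30          # assumed electricity price (currency per kWh)
years = 5

hours = years * 365 * 24
delta_w = adapters * (waste_per_adapter_w - poe_overhead_w)  # net savings (W)
delta_kwh = delta_w * hours / 1000
print(f"~{delta_kwh:.0f} kWh over {years} years "
      f"= {delta_kwh * price_per_kwh:.2f} in electricity")
# With these numbers: 5 * 0.5 W = 2.5 W net, ~110 kWh, ~33 over 5 years,
# typically far less than the price premium of PoE switches and devices.
```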

  • Max-P@lemmy.max-p.me · 9 months ago

    The switch can put out 15.4W, but it doesn’t control how much power flows. The device can draw up to 15.4W if it wants to, but it won’t necessarily do so. The switch can cap its power output by lowering the voltage it supplies, but it can’t push a certain amount of power into the device. That would violate the fundamental physics of electronics.

    Put a 2.4kΩ resistor in as the “device”: at 48V, the absolute maximum power that will flow is ~1W. The switch would have to push ~192V to force that resistor to dissipate 15.4W, which would put it way out of spec. And there’s nothing preventing the device from being smart enough to adjust its resistance to maintain 1W either. That’s basic Ohm’s law.
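To make that arithmetic concrete, here’s the same Ohm’s-law calculation in a few lines of Python (the 2.4 kΩ load is the hypothetical fixed resistor from above):

```python
import math

R = 2400.0   # the hypothetical fixed 2.4 kOhm resistive "device"
V = 48.0     # nominal PoE supply voltage

# Power dissipated by a fixed resistance: P = V^2 / R
p_at_48v = V**2 / R
print(f"Power drawn at {V:g} V: {p_at_48v:.2f} W")       # ~0.96 W

# Voltage the switch would need to force 15.4 W into it: V = sqrt(P * R)
v_for_15w4 = math.sqrt(15.4 * R)
print(f"Voltage needed for 15.4 W: {v_for_15w4:.0f} V")  # ~192 V, way out of spec
```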

    The device must negotiate if it’s going to use more than the default 15.4W, or it can advertise that it’s low-power so the switch can allocate the power budget to other devices as needed. But the switch can only act as a limiter: it may have the ability to provide more than the device takes, but it simply can’t force the device to take more.
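As an illustration of the budgeting side, here’s a toy sketch of a switch allocating its power budget from advertised device classes. The per-class maximums follow 802.3af/at; the allocation logic itself is a simplified, hypothetical example, not how any particular switch firmware works:

```python
# Toy power-budget allocator for a PoE switch (PSE). The per-class maximums
# follow 802.3af/at; everything else here is a simplified illustration.
CLASS_MAX_W = {0: 15.4, 1: 4.0, 2: 7.0, 3: 15.4, 4: 30.0}

def allocate(budget_w: float, device_classes: list[int]) -> list[bool]:
    """Grant power port-by-port until the budget is exhausted.

    The switch reserves each device's class maximum; it can't push more
    power than the device actually draws, it only caps what's available.
    """
    grants = []
    for cls in device_classes:
        need = CLASS_MAX_W[cls]
        if need <= budget_w:
            budget_w -= need
            grants.append(True)
        else:
            grants.append(False)  # over budget: this port stays unpowered
    return grants

# A 65 W budget with one class-4 camera, two class-2 APs, one class-0 device:
print(allocate(65.0, [4, 2, 2, 0]))  # [True, True, True, True] -> 59.4 W reserved
```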