• TWeaK
    17 months ago

    It wouldn’t be critical if every cable were rated for the full specification. If you put a 0.5A cable in a 3A circuit, you’re gonna have a bad time. If every cable is rated for 3A or better, then you don’t need a chip in the cable to tell the devices on either end to limit themselves to 0.5A.
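
    And for currents up to 3A, the cable genuinely doesn’t need any electronics: the source advertises its capability with a pull-up resistor (Rp) on the CC line, and the sink just reads the resulting voltage against its own pull-down (Rd). A minimal sketch of the sink side, assuming a hypothetical `adc_read_cc_mv()` helper and the Type-C spec’s approximate vRd detection bands:

    ```c
    #include <stdint.h>

    typedef enum {
        ADVERT_DEFAULT_USB,  /* 500 mA (USB 2.0) / 900 mA (USB 3.x) */
        ADVERT_1A5,          /* 1.5 A */
        ADVERT_3A0,          /* 3.0 A */
    } cc_advert_t;

    /* Hypothetical board-specific helper: CC pin voltage in millivolts. */
    extern uint32_t adc_read_cc_mv(void);

    /* Decode the source's current advertisement from the CC voltage.
     * No chip in the cable is involved; the thresholds are approximate
     * band edges from the Type-C spec. */
    cc_advert_t sink_read_advertisement(void)
    {
        uint32_t mv = adc_read_cc_mv();

        if (mv > 1310)       /* vRd-3.0 band */
            return ADVERT_3A0;
        else if (mv > 700)   /* vRd-1.5 band */
            return ADVERT_1A5;
        else                 /* vRd-USB band */
            return ADVERT_DEFAULT_USB;
    }
    ```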

    • @anotherandrew@lemmy.mixdown.ca
      17 months ago

      How do you have the cable correctly identify itself if you don’t put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100W (or higher, as the spec has been updated)? And how do you prevent an older cable rated for 100W from being abused in a newer 200W circuit?

      Divider resistors are okay, but the IC is a better choice for future-proofing and reliability.
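
      That IC is the e-marker: when a source wants to offer more than 3A, it queries the cable plug itself (SOP’ in USB PD terms) with a Discover Identity message, and the cable’s VDO reports what current it can carry. That’s also what keeps an older cable from being pushed past its rating in a newer circuit: the source never offers more than the cable attests to. A rough sketch of the source-side decision, with `pd_discover_identity_sop_prime()` as a hypothetical transport helper and the VDO bit positions treated as illustrative (they vary by PD revision):

      ```c
      #include <stdbool.h>
      #include <stdint.h>

      /* Hypothetical helper: sends Discover Identity to SOP' (the cable's
       * e-marker) and returns the cable VDO, or false if the cable didn't
       * respond, i.e. no e-marker is present. */
      extern bool pd_discover_identity_sop_prime(uint32_t *cable_vdo);

      /* Decide the maximum VBUS current the source may offer. */
      uint32_t source_max_current_ma(void)
      {
          uint32_t vdo;

          if (!pd_discover_identity_sop_prime(&vdo))
              return 3000;  /* No e-marker: unmarked cables are capped at 3 A. */

          /* Bits 6:5 of the passive-cable VDO: 01b = 3 A, 10b = 5 A
           * (per PD rev 3.x; illustrative). */
          switch ((vdo >> 5) & 0x3) {
          case 0x2:
              return 5000;  /* Cable attests to 5 A; 100 W contracts allowed. */
          default:
              return 3000;  /* Anything else falls back to the 3 A floor. */
          }
      }
      ```

      The VDO also carries a maximum-voltage field, so newer higher-voltage contracts are gated the same way: an old cable simply never attests to the higher rating.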