You don’t need it though. The power regulation is a decision between the load and the supply devices, the cable is an unnecessary third party. The cable should just be a multicore connection between two things, not a third device.
If I had to go out on a limb though, I’d say it’s because manufacturers were selling cheap cables that didn’t meet the specification, and people were using them with higher power devices, causing overheating. By including a chip in the spec for the cable, you can push some of the responsibility back towards the cable manufacturer, and they can limit the maximum current to whatever they’ve designed for. In which case, we already do have different cables for different power levels - if your cable isn’t rated for 100W, then it might force a lower power even if your device and charger can do 100W. However it would be better if cable manufacturers would just meet the basic design specification to begin with, rather than creating unnecessary overhead.
The cable has to carry the negotiated power safely. It’s not unnecessary, it’s absolutely critical. I’ve personally seen and diagnosed the result of when this fails.
For your low power applications there is no need and the spec allows for that.
It wouldn’t be critical if the cables were suitably rated for the specification. If you put a 0.5A cable in a 3A circuit, you’re gonna have a bad time. If you use a 3A or better cable, then you don’t need a cable chip to tell the actual devices to only work at 0.5A.
How do you have the cable correctly identify itself if you don’t put some smarts in it? Or are you saying we should only be able to buy expensive cables fully rated for 100W (or higher as the spec has been updated) — and how do you prevent an older cable rated for 100W from being abused in a newer 200W circuit?
Divider resistors are okay, but the IC is a better choice for future proofing and reliability.
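To make the resistor option concrete: in USB Type-C the source advertises its current capability with a pull-up (Rp) on the CC line, and the sink reads the resulting voltage across its own pull-down (Rd). A minimal sketch of how a sink might classify that reading; the threshold voltages here are approximate, for illustration, not copied verbatim from the spec:

```python
# Rough sketch: a USB-C sink inferring the source's current advertisement
# from the voltage on the CC line (Rp pull-up in the source, Rd pull-down
# in the sink). Thresholds are approximate, illustrative values.

def advertised_current(cc_voltage: float) -> str:
    """Classify the source's advertisement from the measured CC voltage."""
    if cc_voltage > 1.31:      # strong pull-up: 3 A advertisement
        return "3.0A"
    if cc_voltage > 0.70:      # medium pull-up: 1.5 A advertisement
        return "1.5A"
    if cc_voltage > 0.25:      # weak pull-up: default USB current
        return "default USB"
    return "no source attached"

print(advertised_current(1.7))
```

This is all the cable-free scheme gives you: a fixed current ceiling advertised by the supply. Anything richer, like a cable reporting its own rating, is where the e-marker IC comes in.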
It doesn’t make any difference whether the negotiation is between the supply and the device or between the cable and the device; either way it’s still two devices talking.
By pushing the responsibility onto the cable it allows you to operate the cable directly from a USB port. So you can have things like electrical sockets with USB connections and you don’t have to have chips in the sockets, because typically they’re just dumb electrical interfaces. It also means that the device delivering the power doesn’t have to be actually fully switched on, so you can recharge your phone from a USB port on your computer and you don’t have to power the computer on. As long as there is an open electrical channel to the port the cable will deal with it all itself.
Also it’s more efficient, because otherwise you would have to have a control circuit in every single power delivery device. This way you can have it in just the one cable, so it’s one chip serving an unlimited number of power delivery devices.
So you can have things like electrical sockets with USB connections and you don’t have to have chips in the sockets, because typically they’re just dumb electrical interfaces.
If the supply is dumb and cannot negotiate power, then there is no need to negotiate power and it will fall back on regular 5V USB. The same if the load is dumb. In this case, there is no need for a cable chip.
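That fallback behaviour reduces to a tiny rule. A sketch, using hypothetical capability flags for each end rather than anything from the actual protocol:

```python
def charging_mode(supply_negotiates: bool, load_negotiates: bool) -> str:
    # Negotiation only happens when both ends can do it; if either end
    # is "dumb", both sides fall back to plain 5 V USB and a cable chip
    # would have nothing to do.
    if supply_negotiates and load_negotiates:
        return "negotiated power delivery"
    return "5V default USB"

print(charging_mode(True, False))
```

The point being argued: in the dumb case the cable chip is irrelevant, and in the smart case the two endpoints could in principle handle it between themselves.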
It also means that the device delivering the power doesn’t have to be actually fully switched on, so you can recharge your phone from a USB port on your computer and you don’t have to power the computer on.
If the USB port has power on it, the computer is supplying it. The voltage would be present but open circuit. The computer would not have to power the negotiation circuitry until a cable has been connected end to end and the circuit is closed.
You’re trying to present this as the cable replacing one of the devices, but it doesn’t, it’s an extra 3rd device in the negotiation. All 3 devices must permit a certain charging level for that level to be used. It may have some benefit in ensuring that cable load capacity isn’t exceeded, but like I say it would be far better if the cables were reliably manufactured properly to handle the specified loads.
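The three-party constraint described above boils down to a minimum: the agreed level can never exceed what the supply, the load, or the cable will permit. A toy illustration (the wattages are made up):

```python
def negotiated_power_w(supply_max: int, load_max: int, cable_max: int) -> int:
    # Every party in the negotiation caps the result, so the
    # lowest rating of the three always wins.
    return min(supply_max, load_max, cable_max)

# A 60 W-rated cable limits a 100 W charger and a 100 W laptop:
print(negotiated_power_w(100, 100, 60))
```

Which is exactly the complaint: the cable is a third voter in the election, not a stand-in for one of the two endpoints.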