That's an informative video, but unless the PCIe power specs have changed, the motherboard PCIe slot carries the first 75 watts of the GPU's load, so the cable should never need to carry more than about 525 watts, spread across its six 12V pairs, which is well inside its roughly 660 watt rating. But if he's reading 20+ amps on one wire, a single pair is carrying 240+ watts of a cable draw of only about 400 watts (IIRC he said total system draw was 575 watts, the rest of the system about 100 watts, and PCIe slot power 75 watts).
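If anyone wants to sanity-check that arithmetic, here's a rough Python sketch of the split I mean; the wattages are my recollection of the video's figures, so treat them as ballpark:

```python
# Rough sanity check of the numbers above (figures are my recollection of the
# video, so treat them as ballpark).
TOTAL_SYSTEM_W = 575     # total system draw he reported
REST_OF_SYSTEM_W = 100   # everything except the GPU
PCIE_SLOT_W = 75         # what the motherboard slot supplies to the GPU
RAIL_V = 12              # 12V rail
PAIRS = 6                # 12V wire pairs in the cable

cable_w = TOTAL_SYSTEM_W - REST_OF_SYSTEM_W - PCIE_SLOT_W  # ~400 W down the cable
even_amps = cable_w / RAIL_V / PAIRS                       # ~5.6 A per pair if shared evenly

clamped_amps = 20                                          # what he measured on one wire
one_pair_w = clamped_amps * RAIL_V                         # ~240 W on that single pair

print(f"cable load: {cable_w} W")
print(f"even split: {even_amps:.1f} A per pair")
print(f"measured:   {clamped_amps} A on one wire = {one_pair_w} W on that pair")
```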
Despite Molex saying gold over nickel plating was good, the problem with gold in this scenario is that it is a softer metal and less heat tolerant, so at 100°C (which that first set of pins should be nowhere near) the contact's conductivity drops off quickly. Higher resistance means more heat dissipated at the same current, which pushes resistance up further, in a cyclical trap; and every time that first pairing runs that hot, the insulation and shielding on that pair degrade a little more, increasing electrical interference. How this got through testing and into a product is shocking, and downright dangerous. There should be a 400 watt power limit per PC component, unless we're bringing the Foreman-grill-with-kettle school of design into the PC world.
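To put some rough shape on that cyclical trap, here's a toy Python model of a single contact; every constant in it is invented for illustration, not taken from any connector datasheet:

```python
# Toy model of the cyclical trap above: more heat -> hotter contact -> more
# resistance -> more I^2*R heat. Every constant is invented purely to show the
# shape of the loop; none of this is measured 12VHPWR/Molex data.
def contact_temp(amps, seconds=600):
    r0 = 0.010           # assumed cold contact resistance, ohms (hypothetical)
    alpha = 0.004        # assumed resistance rise per degree C (hypothetical)
    cooling = 0.04       # assumed watts shed per degree above ambient (hypothetical)
    heat_capacity = 2.0  # assumed joules per degree for pin + housing (hypothetical)
    ambient = temp = 25.0
    for _ in range(seconds):                          # 1-second steps
        resistance = r0 * (1 + alpha * (temp - ambient))
        heat_in = amps ** 2 * resistance              # I^2 * R generated in the contact
        heat_out = cooling * (temp - ambient)         # heat leaking away to cable/air
        temp += (heat_in - heat_out) / heat_capacity
    return temp

print(f"~5.6 A (fair share of ~400 W): {contact_temp(5.6):.0f} C")
print(f"20 A (one pair hogging it):    {contact_temp(20.0):.0f} C")
```

In that made-up example the evenly loaded pair sits barely above ambient while the 20 A pair climbs towards 200°C, and if the resistance rose faster than the contact could shed heat there would be no settling point at all, which is the runaway case.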
Zero common sense on display from the world's richest company designing GPUs; who could have guessed it?