A wire that is heating up is not properly sized for its application. If there is heat, there is voltage drop. Voltage drop = amperes x (total resistance of the length of cable in ohms). If you used a 20 gauge single wire cable to transfer 100 watts at 12 volts (or 8.3 amperes) over 2 feet, you would have a voltage drop of about 1.4%. (FYI: The recommended maximum voltage drop is 2%.) An 8 gauge single wire cable doing the same job would have a voltage drop of roughly 0.09%.
Granted, an 8 gauge single wire cable is certainly overkill, but it was just an example to show that the larger wire has less voltage drop.
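If you want to run the same arithmetic yourself, here is a minimal Python sketch. The ohms-per-foot figures are nominal 20 C copper values from a standard AWG table and the helper function name is just for illustration, so treat the output as approximate.

```python
# Nominal copper resistance per foot at 20 C, from a standard AWG table.
OHMS_PER_FOOT = {20: 10.15 / 1000, 8: 0.6282 / 1000}  # gauge -> ohms per foot

def voltage_drop_percent(watts, volts, length_ft, gauge):
    """Percent voltage drop over a single-wire run of the given length."""
    amps = watts / volts                          # I = P / V
    resistance = OHMS_PER_FOOT[gauge] * length_ft  # total ohms for the run
    drop_volts = amps * resistance                 # V_drop = I x R
    return 100 * drop_volts / volts

for gauge in (20, 8):
    print(f"{gauge} AWG: {voltage_drop_percent(100, 12, 2, gauge):.2f}% drop")
# Prints roughly 1.41% for 20 AWG and 0.09% for 8 AWG.
```

The 2% guideline from above gives you a quick pass/fail test on whatever the calculation prints out.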