What booster said -- and keep connections (plugs and receptacles) to a minimum as well. Each set adds a little more resistance (not so much when they are new).
This is probably way too much info, but it may help you down the road:
The voltage drop depends on the amount of current flowing from your solar panel to your device: E (volts) = R (resistance in ohms) X I (current in amps). 14 ga copper wire has a nominal resistance (from my chart) of .00257 ohms per foot, so a 30 foot run of 14 ga wire would be ~.077 ohms. 12 ga wire has a resistance of .0016 ohms per foot, or ~.048 ohms in a 30 foot run.
If your device is consuming 1 amp of current, your voltage drop across 30 feet of 14 ga wire would be .077 volts, or 77 millivolts (E = 1 amp X .077 ohms). With 12 ga wire the theoretical loss decreases to .048 volts, or 48 millivolts. You would have to decide whether a savings of 29 millivolts (77 - 48) is worth the upgrade.
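If you want to play with the numbers yourself, the calculation above is just a one-liner. This is a minimal sketch using the per-foot resistance values from my chart (nominal copper values; your wire may vary):

```python
# Voltage drop from Ohm's law: E = I * R.
# Per-foot resistances are nominal values for copper, from the chart above.
R_PER_FOOT = {"14ga": 0.00257, "12ga": 0.0016}  # ohms per foot

def voltage_drop(current_amps, gauge, length_feet):
    """Drop across a run of the given gauge and length, in volts."""
    return current_amps * R_PER_FOOT[gauge] * length_feet

# 1 amp over a 30-foot run:
for gauge in ("14ga", "12ga"):
    print(f"{gauge}: {voltage_drop(1, gauge, 30) * 1000:.0f} mV")
```

Running it reproduces the 77 mV vs. 48 mV comparison from the paragraph above.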
You didn't say how much current is flowing ("into the tool") or what you are powering, and the current is the key to the amount of voltage drop across a cable.
If your tool is using 15 amps (the max for 14 ga), your voltage drop is now 1.15 volts (E = 15 amps X .077 ohms). For 30 feet of 12 ga, your loss at 15 amps is .72 volts. That difference (1.15 V vs. .72 V) is getting significant. There are other factors too: heat, number of wires in a bundle, stranded vs. solid, type of conductor used, etc.
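The same sketch shows why the 15 amp case matters more: the loss becomes a noticeable fraction of the supply voltage. Here I assume a nominal 12 V solar setup (my assumption, not something you stated; plug in your own system voltage):

```python
# Ohm's-law drop at 15 A, expressed as a fraction of the supply.
# 12 V is an assumed system voltage for a small solar setup.
R_PER_FOOT = {"14ga": 0.00257, "12ga": 0.0016}  # ohms/ft, from the chart above

def voltage_drop(current_amps, gauge, length_feet):
    return current_amps * R_PER_FOOT[gauge] * length_feet

SUPPLY_V = 12.0  # assumed nominal system voltage
for gauge in ("14ga", "12ga"):
    drop = voltage_drop(15, gauge, 30)
    print(f"{gauge}: {drop:.2f} V dropped ({drop / SUPPLY_V:.1%} of {SUPPLY_V:.0f} V)")
```

The 14 ga figure comes out a hair above 1.15 V because the script keeps the unrounded run resistance instead of the .077 ohm figure I rounded to above. Either way, losing nearly a tenth of a 12 V supply in the cable is significant.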
Sorry, I tried to keep it simple. When in doubt, bigger is better.