Long winded, rambling post...
The move to wider plug gaps correlates with the move to electronic ignition. The ability to provide higher secondary voltages under all conditions permits reliable firing of the wider gaps under starting, idle, and high-speed conditions. Back in the days of points that was not true. I had a TR3 back in the day, and if the point gap was bigger than .014 (IIRC) the dwell was too short and the coil didn't have enough time to charge fully at anything over about 4,000 RPM. The engine would misfire so badly it might as well have had a governor on it. With electronic ignition the coil is saturated in much less time than is available at RPMs way above redline. (Not exactly correct technically, but I don't know how to explain it better.) Today we have motorcycle engines that routinely fire reliably at 12,000 RPM.
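If you want to see why points ran out of steam around 4,000 RPM, here's a quick back-of-envelope sketch. The numbers (a coil that needs roughly 4 ms to saturate, points closed about 2/3 of the time) are illustrative assumptions, not measurements from any particular engine:

```python
# Available dwell time per spark vs. RPM for a 4-cylinder, 4-stroke
# engine with a single coil and distributor. Illustrative numbers only.

COIL_CHARGE_MS = 4.0  # assumed time for the coil to fully saturate

def dwell_ms(rpm, cylinders=4, closed_fraction=2/3):
    sparks_per_rev = cylinders / 2               # 4-stroke: each plug fires every 2 revs
    period_ms = 60_000 / (rpm * sparks_per_rev)  # time between sparks
    return period_ms * closed_fraction           # points are only closed part of that

for rpm in (1000, 4000, 8000):
    d = dwell_ms(rpm)
    ok = "coil saturates" if d >= COIL_CHARGE_MS else "coil starved -> misfire"
    print(f"{rpm:>5} RPM: {d:.2f} ms dwell  ({ok})")
```

At idle there is dwell to spare; by the time you double the 4,000 RPM mark, the coil simply never finishes charging, which is the misfire the TR3 story describes.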
As far as resistance goes, it serves to limit the duration of the spark. The main functional reason is to reduce plug wear. (A side benefit is reduced RFI.) Excess resistance means more reduction of the spark duration. Eventually it will cause performance issues, but, apparently, combining the "recommended" resistor plugs with the standard resistor wires, at least if everything else is up to snuff, doesn't cause significant problems. On the other hand, there is no significant benefit, either.
In a nutshell, what happens is this: current flows through the coil's primary winding, and the magnetic field created by that current surrounds the coil. It takes a bit of time for the current, and the resulting magnetic field, to reach their maximum values, at which point the coil is "saturated." When the time comes to fire the plug, the current in the primary winding is suddenly interrupted (the points open, in the old world). The magnetic field surrounding the primary collapses rapidly, and this changing field induces a voltage in the secondary windings of the coil. That voltage is several thousand times higher than the voltage applied to the primary and could easily reach 50,000 volts were there no spark plug.

The voltage is conducted to the spark plug electrodes without loss, since no current is flowing yet (voltage loss = current times resistance; the current is zero, so the drop is too). The electrostatic field created in the plug gap by the rising voltage difference between the two electrodes eventually causes the air in the gap to ionize. Ionized air is a much better conductor, so current flows (and the voltage stops rising). That current causes a voltage drop across the resistor wires and any resistor in the plug, proportional to the current times the total resistance. The drop quickly reaches a value at which there is insufficient voltage left to sustain the spark, and the plug stops firing. How long that takes also depends on the resistance presented by the plug gap and the ionized air. The bigger the plug gap, the faster the drop reaches the point where the plug stops firing, so, at least performance-wise, increasing the gap has the same effect as increasing the resistance, assuming you don't increase it to the point where the plug can't fire at all.
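The quenching part of that story can be put in a toy model: let the coil's output decay exponentially as its stored energy drains, subtract the I*R drop in the wires and plug resistor, and call the spark dead when what's left falls below a sustaining voltage. Every number here (peak voltage, spark current, sustaining voltage, decay constant) is made up purely to show the trend, not to match any real ignition:

```python
import math

def spark_duration(r_total_ohms, v0=15_000.0, i_spark=0.08,
                   v_sustain=1_000.0, tau_ms=1.0):
    """Time (ms) until the decaying coil voltage, minus the I*R drop in
    the wires and plug resistor, can no longer hold the gap above its
    sustaining voltage. Crude exponential model; all values illustrative."""
    # Spark lives while v0 * exp(-t/tau) > v_sustain + i_spark * r_total
    return tau_ms * math.log(v0 / (v_sustain + i_spark * r_total_ohms))

print(spark_duration(5_000))    # modest total resistance: longer spark
print(spark_duration(10_000))   # double the resistance: spark quenches sooner
```

Doubling the series resistance shortens the spark, which is exactly the "excess resistance means more reduction of the spark duration" point above, and why stacking resistor plugs on resistor wires trims duration without (normally) killing the spark outright.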
Side thoughts: Denser air is a better insulator, so increasing compression or supercharging both require higher voltages or smaller plug gaps to fire reliably. Again, though, modern electronic ignitions make this a moot point.
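The denser-air point follows a rough rule of thumb: away from the Paschen minimum, the breakdown voltage of an air gap scales approximately with pressure times gap, and about 3 kV/mm at 1 atm is a commonly quoted ballpark. The cylinder pressures below are illustrative assumptions:

```python
# Rough scaling of spark-over voltage with gap size and air density.
# ~3 kV/mm at 1 atm is a textbook ballpark for air; not exact physics.

def breakdown_kv(gap_mm, pressure_atm):
    E_BREAKDOWN_KV_PER_MM = 3.0   # approx. dielectric strength of air at 1 atm
    return E_BREAKDOWN_KV_PER_MM * gap_mm * pressure_atm

# Same 1.0 mm gap, compressed mixture at ~8 atm vs. ~10 atm:
print(breakdown_kv(1.0, 8))    # higher compression...
print(breakdown_kv(1.0, 10))   # ...demands still more voltage
# Or keep the same voltage budget and close the gap up instead:
print(breakdown_kv(0.8, 10))
```

Bumping compression (or boost) raises the voltage needed to fire a given gap, and tightening the gap buys it back, which is the trade-off the paragraph describes.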
Increasing the plug gap (assuming nothing else changes) requires a higher voltage before the spark event occurs, and so has the effect of retarding the timing slightly. This could facilitate starting, but I doubt the effect would be large enough to cause a noticeable difference. In fact, I'd be surprised if it was even measurable (in degrees of crankshaft rotation).
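A quick sanity check on how small that retard really is: the crank turns 6 degrees per millisecond per 1,000 RPM, and the extra secondary-voltage rise time a wider gap demands is on the order of microseconds. The ~10 microsecond figure below is an assumed round number, not a measurement:

```python
# How many crank degrees does a slightly later spark event cost?

def retard_degrees(rpm, extra_rise_us):
    deg_per_us = rpm * 360.0 / 60.0 / 1_000_000.0  # crank degrees per microsecond
    return deg_per_us * extra_rise_us

print(retard_degrees(3000, 10))   # mid-range RPM: a fraction of a degree
print(retard_degrees(200, 10))    # cranking speed: smaller still
```

Even at mid-range RPM the shift is well under a degree of crankshaft rotation, which backs up the "I'd be surprised if it was even measurable" hunch.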