Here is a simple(?) explanation of the temperature effect on coaxial cable attenuation.
Coax cable attenuation/loss changes about 0.1% per degree F of temperature change.
Attenuation is usually spec'd by the vendors at 68 deg F. So a cable having a 1.43 dB loss/100 ft @ 500 MHz at 68 deg F will have a loss of about 1.47 dB/100 ft @ 500 MHz at 99 deg F (a 31-degree rise, roughly 3.1% more loss).
That is just at air temperature. A more realistic upper temperature is 120-150 deg F, since the overhead cable hangs in direct sunlight. Using 128 deg F (a 60-degree rise, about 6% more loss), that same 100 ft piece of cable is now at roughly 1.52 dB loss @ 500 MHz.
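Here's a quick Python sketch of that per-100-ft scaling, just to make the arithmetic explicit. It assumes the 0.1%-per-deg-F figure scales linearly from the 68 deg F spec point; the function name and the coefficient default are mine for illustration, not from any vendor datasheet.

    # Scale a 68 deg F attenuation spec to another temperature.
    # Assumes a linear ~0.1% change per deg F (rule of thumb, not a datasheet value).
    def loss_at_temp(loss_at_68f_db, temp_f, coeff_per_degf=0.001):
        return loss_at_68f_db * (1 + coeff_per_degf * (temp_f - 68))

    print(loss_at_temp(1.43, 99))   # ~1.47 dB/100 ft @ 500 MHz
    print(loss_at_temp(1.43, 128))  # ~1.52 dB/100 ft @ 500 MHz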
Still not much of a change, right? But remember there are probably many miles of cable between you and the source. Over 2 miles (10,560 ft), the total cable loss from the source to you goes from about 151 dB @ 500 MHz up to roughly 160 dB @ 500 MHz. That is about a 9 dB decrease in available signal. Could definitely be enough to affect marginal levels.
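Same rule of thumb applied to a whole run, again ignoring amps, passives, and everything else in the plant; the 2-mile length is just the example figure from above, and the helper name is mine.

    FEET_PER_MILE = 5280

    # Total cable-only loss for a run at a given temperature,
    # using the same linear 0.1%-per-deg-F assumption.
    def total_loss_db(loss_per_100ft_at_68f, temp_f, miles, coeff_per_degf=0.001):
        per_100ft = loss_per_100ft_at_68f * (1 + coeff_per_degf * (temp_f - 68))
        return per_100ft * (miles * FEET_PER_MILE / 100)

    cold = total_loss_db(1.43, 68, 2)    # ~151 dB
    hot  = total_loss_db(1.43, 128, 2)   # ~160 dB
    print(round(hot - cold, 1))          # ~9.1 dB more loss when the cable is hot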
The above is really simplistic, just to show what temperature can do. An actual plant will have many more variables.
As for only the shortest outlet having a problem when fed from a particular port on the splitter? [shrug] Cables and splitters/ports do go 'bad'. Let an onsite tech with a meter figure out what to do. BTW, would someone double-check the math......
--
one should not increase, beyond what is necessary, the number of entities required to explain anything