For example: if the cable loss is 3 dB, the factor is 0.5 (see the table in the linked document). So if I configure 50 mW at the AP, only 25 mW will be available at the antenna because of the cable loss, right?
Re: Antenna cable loss and increasing the power level
You're correct about your calculation, and that data sheet is a very good document to read through to gain a solid understanding of wireless communication.
One thing to keep in mind: a 3 dB loss does indeed cut your power in half (50 mW to 25 mW, for example). However, your coverage area isn't cut in half. The reason we use dB in the world of wireless is that it accurately reflects how wireless signal propagates: received power falls off steeply with distance, so a fixed power ratio translates into a much smaller change in range. If you have coverage of 200 feet at 20 dBm (100 mW) and you lose 3 dB for any reason, your coverage drops to roughly 170 feet in a typical indoor environment, not to 100 feet.
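To make the dB-versus-distance relationship concrete, here is a short Python sketch. The log-distance path-loss model and the exponent value of 4 (typical for cluttered indoor environments) are my assumptions, not something stated in the post:

```python
def db_to_linear(db):
    """Convert a dB value to a linear power ratio."""
    return 10 ** (db / 10)

def range_after_loss(range_ft, loss_db, path_loss_exponent=4.0):
    """Estimate the new coverage radius after losing loss_db of power.

    Under a log-distance path-loss model, received power falls off as
    distance ** (-n), so range scales as (power ratio) ** (1 / n).
    The default exponent n = 4 is an assumed indoor value; free space is n = 2.
    """
    power_ratio = db_to_linear(-loss_db)
    return range_ft * power_ratio ** (1 / path_loss_exponent)

print(db_to_linear(-3))             # ~0.5: a 3 dB loss halves the power
print(range_after_loss(200, 3))     # ~168 ft with n = 4 (indoor assumption)
print(range_after_loss(200, 3, 2))  # ~141 ft in free space (n = 2)
```

Note how strongly the result depends on the environment: the same 3 dB loss costs you far more range outdoors (free space) than indoors, which is why rules of thumb like "170 feet" only hold for a given propagation exponent.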
This should explain why an AP's power settings generally go from 100 mW to 50 mW to 25 mW, etc.: each step down is a 3 dB drop. One might wonder why on earth you'd ever configure an AP to operate at 4 mW, a tiny fraction of the full power. On the dB scale, though, 4 mW is about 6 dBm versus 20 dBm at full power, so in the terms that actually govern coverage it's closer to 30% of full power, not 4%.
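The mW-to-dBm relationship behind those power steps can be sketched in a few lines of Python (the list of power levels is illustrative; actual steps vary by vendor):

```python
import math

def mw_to_dbm(mw):
    """Convert milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(mw)

# Typical AP power steps: each halving of mW is a ~3 dB drop.
for mw in (100, 50, 25, 12.5, 6.25, 4):
    print(f"{mw:6} mW = {mw_to_dbm(mw):5.1f} dBm")

# On the dBm scale, 4 mW (~6 dBm) sits at about 30% of 20 dBm (100 mW).
print(mw_to_dbm(4) / mw_to_dbm(100))  # ~0.30
```

This is why the settings menu looks like a steep drop-off in mW but is actually a ladder of even 3 dB steps.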
So again, you're correct to say that a 3 dB loss can be countered by a 3 dB increase in power at the source. Always try to think in terms of dB, not mW. If you think in terms of mW, you'll make errors when estimating coverage.