For instance, if a transmitter delivers 100 W into an antenna having an efficiency of 80%, then the antenna will radiate 80 W as radio waves and produce 20 W of heat. In order to radiate 100 W of power, one would need to use a transmitter capable of supplying 125 W to the antenna. Note that antenna efficiency is a separate issue from impedance matching, which may also reduce the amount of power radiated using a given transmitter. If an SWR meter reads 150 W of incident power and 50 W of reflected power, that means that 100 W have actually been absorbed by the antenna (ignoring transmission line losses). How much of that power has actually been radiated cannot be directly determined through electrical measurements at (or before) the antenna terminals, but would require (for instance) careful measurement of field strength. Fortunately the loss resistance of antenna conductors such as aluminum rods can be calculated and the efficiency of an antenna using such materials predicted.
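The arithmetic in these examples can be summarized in a short sketch (the function names here are illustrative, not from any standard library):

```python
def radiated_power(input_w: float, efficiency: float) -> float:
    """Power radiated as radio waves; the remainder is dissipated as heat."""
    return input_w * efficiency

def required_input(target_radiated_w: float, efficiency: float) -> float:
    """Transmitter power needed to radiate a target amount of power."""
    return target_radiated_w / efficiency

def absorbed_power(incident_w: float, reflected_w: float) -> float:
    """Net power accepted by the antenna, ignoring transmission-line losses."""
    return incident_w - reflected_w

print(radiated_power(100, 0.80))   # 80.0 W radiated (so 20 W becomes heat)
print(required_input(100, 0.80))   # 125.0 W needed to radiate 100 W
print(absorbed_power(150, 50))     # 100.0 W absorbed per the SWR meter readings
```

Note that `absorbed_power` gives only the power accepted by the antenna; as the text explains, how much of it is actually radiated cannot be determined from terminal measurements alone.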
