Consider a half-wave dipole designed to work with signals of 1 m wavelength, meaning the antenna would be approximately 50 cm across. If the element has a length-to-diameter ratio of 1000, it will have an inherent resistance of about 63 ohms. Using the appropriate transmission line or balun, we match that resistance to ensure minimum signal loss. Feeding that antenna with a current of 1 ampere will require 63 volts of RF, and the antenna will radiate 63 watts (ignoring losses) of radio frequency power. Now consider the case when the antenna is fed a signal with a wavelength of 1.25 m; in this case the reflected current arrives at the feedpoint out of phase with the signal, causing the net current to drop while the voltage remains the same. Electrically this appears to be a very high impedance. The antenna and transmission line no longer have the same impedance, and the signal will be reflected back into the antenna, reducing output. This could be addressed by changing the matching system between the antenna and transmission line, but that solution only works well at the new design frequency.
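The matched-case arithmetic in the passage can be sketched with Ohm's law and the power law; this is a minimal illustration assuming the stated values of a 63-ohm feedpoint resistance and a 1-ampere drive current:

```python
# Assumed values from the passage above.
R = 63.0   # feedpoint resistance of the half-wave dipole, in ohms
I = 1.0    # RF drive current, in amperes

V = I * R      # required RF voltage (Ohm's law)
P = I**2 * R   # radiated power, ignoring losses

print(f"Voltage: {V} V, radiated power: {P} W")
```

With I = 1 A, both the voltage (63 V) and the power (63 W) are numerically equal to the resistance, which is why the passage quotes 63 for both.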
Try to answer this question if possible (otherwise reply "unanswerable"): What effect occurs when the signal is reflected back into the antenna?
reducing output