In the United States and elsewhere in the world, there has been a marked rise in the evangelical wing of Protestant denominations, especially those that are more exclusively evangelical, and a corresponding decline in the mainstream liberal churches. In the post–World War I era, Liberal Christianity was on the rise, and a considerable number of seminaries held to and taught from a liberal perspective. In the post–World War II era, the trend swung back towards the conservative camp in America's seminaries and church structures.

Where has there been a rise in evangelical Protestantism?