Christianity remains the dominant religion in the Western world, where about 70% of the population is Christian. A 2011 Pew Research Center survey found that 76.2% of people in Europe, 73.3% in Oceania, and about 86.0% in the Americas (90% in Latin America and 77.4% in North America) described themselves as Christians.
If it is possible to answer this question, answer it for me (otherwise, reply "unanswerable"): Where does Christianity struggle to maintain dominance?
unanswerable