In Brief... Religion Important to Americans
Religion appears to be more important in the United States than in any other Western country.
A recent consumer marketing survey revealed that 54 percent of Americans say "religion plays an important part in my life." Other Western countries showed much lower figures: the Netherlands, 25 percent; the United Kingdom, 19 percent; Germany and France, 14 percent each (Los Angeles Times, from AP, June 5, 1999).