I am from America, so western society, yes. And while many European nations demonized wolves for centuries, which carried over into American culture, I've found that many Americans romanticize the species instead now that wolves have been listed under the Endangered Species Act. Wolves, along with tigers, are like the poster species for endangered animals, and from that, along with popular New Age stuff, they've been romanticized as far as I've seen.