The Occident
- Term originally referring to Europe but now including North America and South America as well. Occident means "the West," as opposed to Orient, "the East."