Anthropocentrism Definition

The term anthropocentrism refers to the belief that human beings are the most important species on the planet or the central element of the universe. Under this view, reality is assessed from an exclusively human perspective. In the art world, the art of ancient Greece and Rome focused on depicting human features, proportions, and forms. In Europe, the human figure lost its central place in art with the arrival of the Dark Ages and regained it with the Renaissance.