American Art Definition

American art generally refers to art originating in the North American colonies and the United States. Although the term can denote a wide variety of styles and methods, perhaps the most recognizable and iconic American art form is the painting of realistic portraits and landscapes.