Germany colonized Namibia, in southern Africa, for a brief period, from 1884 to the end of World War I. The story of so-called German South West Africa (1884-1915) is hideous: a hidden and silenced account of looting and genocide.
Released: April 22, 2019
Title: Unter Herrenmenschen — Der deutsche Kolonialismus in Namibia
Runtime: 53 min
Languages: English, German