Imperialism
Discover Imperialism
im·pe·ri·al·ism
/imˈpirēəˌlizəm/
noun: a policy of extending a country's power and influence through diplomacy or military force.
The dictionary definition of imperialism does not fully convey the effects it has had on many countries around the world.
Website Creator: Hannah Selken
Published on 1/4/15
All images on this website were found on Google Images.