imperialism (Noun) — The policy of extending a nation's authority by territorial acquisition or by the establishment of economic and political dominance over other nations.
imperialism (Noun) — A political orientation that advocates imperial interests.
imperialism (Noun) — Any instance of aggressive extension of authority.