Imperialism is the practice by which a powerful country extends its control over other regions, either by colonizing them directly or by dominating them politically and economically. Whether it was “bad” depends on whose perspective you take, but for the people under imperial control it usually meant loss of land, culture, and freedom, and often exploitation or violence. Some argue that empires brought infrastructure or modernization, but those projects mostly served the empire's interests rather than the local population's, which is why imperialism is widely regarded as harmful overall.