Colonialism and Gender

Colonialism refers to the political and economic domination of one country by another, more powerful country. Gender, in this context, concerns the plight of women under colonial rule, which deepened gender imbalances and left women more vulnerable. Patriarchal authority was pervasive in the colonial era: colonial authorities looked down on women in the colonies, treated them as subordinates, and often exploited their position. The economic transformations of the colonial period further undermined women's status and diminished their political role.