Gender Roles
Gender roles are the behaviors and expectations that society assigns to men and women
based on their gender. They especially influence relationships between
men and women.
Gender roles have been changing in Western society in recent decades,
and generally have become more flexible. However, traditional gender
roles still have some influence.
For example, it used to be expected that men would experiment
sexually before marriage, but that women would not. Women who went
against this expectation were considered "loose" or "fallen" women,
while men who went against the expectation were considered less than
manly.
Years ago, another expectation was that women would get married and stay
home to raise a family, while men were expected to go out to work to
support the family.
If a woman chose to have a career, she was considered "barren" or
"lacking in maternal instinct," and her partner was often considered
inadequate, since it was assumed he was not a "good provider."
Obviously, things have changed to some degree. Today there is more
sharing of family and household responsibilities, and both men and women
are working in less traditional careers: we now have both male and
female nurses, firefighters, engineers, and dentists.
In the sexual arena, however, things are still not as flexible as
they might be. Many people still expect the man to take responsibility
for initiating and ending sexual activity and for carrying a condom.
Women are expected to be less sexually aggressive than men, even though
we know that women's sexual feelings can be just as strong as men's.
How do gender role expectations affect you? Do you feel free to be
passive if you're a male or aggressive if you're a female? Do you feel
comfortable bringing out a condom? Do you feel there are some things men
or women shouldn't do sexually?