Tuesday, July 20, 2010
Gender Roles
There is no doubt that the U.S. is a male-dominated society, or at least a male-run society. The government is run by men, and most of corporate America is run by men. With that said, there is a definite place for women. Women have become a vital part of the corporate scene, and they bring a different mentality to the way business is done.

As for the roles at home, the traditional line is being blurred. Women are going to work while the men stay at home with the children. The tradition of the homemaking woman is no longer an expectation. The idea of a man staying home with the kids would have seemed like a joke in the '50s, but now it is a part of American life. The roles of men and women are really no different anymore. I have never been in a situation where women were paid less, but I am sure that sometimes happens.

The gender roles are no longer divided like they once were. Honestly, it is hard to write about because I don't see a line between what men and women do. Even the sex roles are no longer defined. There has been a rise in the awareness and acceptance of homosexuality, so sex has become less of a taboo and one cannot be defined by that either. Sure, there is still an underlying prejudice toward women at home, but it is becoming less and less of an issue.
I agree that gender roles have changed a lot in recent times. As we progress toward more equal ground between men and women, many more things will change. The corporate world is still dominated by men, but women are starting to rise to positions of power. It will simply take time. It has only been recently that women have had the opportunity to rise to the top positions, and it will not be long before women use these opportunities to get the education and career records to achieve many more of these positions.