Women Viewed Sexually
Humans have lived for centuries in what seems to be a male-dominated society. Modernization has changed much of this, but not everywhere. Do you think the Western World has moved beyond seeing women mostly as 'sexual assets' and now regards them as equal to men, or even superior? What about the Eastern World?