But then if I stop in a store afterwards, I get treated like dirt by other women. To be honest, I feel makeup on women is more about getting acceptance from other women. Heaven forbid you don't wear it to a doctor's appointment or a job interview, or else the staff treat you like a low-life.
Where do you live, that other women notice, or care? What part of the country is like this?
These days, make-up is considered a requirement for certain high-profile jobs. Not for ordinary office jobs, though.
I've seen before-and-after photos in magazines of women with and without makeup, and I always think the "before" photos show the women as much more personable and approachable. Their personality and warmth show more. The mask masks that, and seems to put up a subtle barrier between her and the world.
No one in my neighborhood, adult or teen, nor in my extended family, ever wore make-up, except lipstick for special occasions. It wasn't part of the culture. Make-up for mass consumption didn't exist until sometime around the 1950s, when Max Factor adapted Hollywood make-up supplies for the general consumer. It took a while to catch on in some circles. It wasn't approved of; it was considered a tool of the trade of women who sell themselves, and there was a stigma. California beach culture was more about a healthy natural look and a tan (not so healthy after all, though no one knew it at the time).
Of course they are going to select women who don't need it for a magazine photo shoot, or the photos could have been airbrushed. But there are women who suffer from discoloration from acne blemishes, dark under-eye circles, and other things, and wearing makeup evens out their overall skin tone. No one should have to walk around with imperfections like that if they don't want to, just because of some man's prejudice against makeup.
Where do you live, that other women notice, or care? What part of the country is like this?
These days, make-up is considered a requirement for certain high-profile jobs. Not for ordinary office jobs, though.
Not true... I was the parts manager for an aircraft repair station; I worked with 11 men, and we weren't open to the public. My boss actually told me that I didn't look like I took the job seriously when I didn't have my hair and makeup done.
Even as a mechanic I was told I should 'spruce up' my looks with hair and makeup, and a fellow mechanic actually told me in front of an entire store meeting that I looked horrible and ugly when I showed up without makeup after taking a quick shower.
So yeah, even in the trades women are still supposed to do their hair and makeup, which makes perfect sense when I'm covered in grease, elbow deep in an engine. Gotta look pretty!!
No, you are not the only man who prefers women who don't wear makeup. Check the stats on the poll at this webpage:
[url=http://www.midlifebachelor.com/articles/womenneedmakeup1.html]Do Women Really Need to Wear Makeup[/url]
Makeup is a turn-off for a lot of men. Obviously there's a reason we're saying this; the fact that we don't read Cosmo doesn't make us stupid. It could be that the reason we say we don't like makeup is that we actually just don't like makeup.
All makeup does is cover natural beauty, and make a woman look like she's wearing makeup. Most women look better without it.