I've been pondering this philosophical issue for a little while now; besides, it's a welcome change from the 'America sucks' topics.

This involves body alteration. When I was young, tattoos were mostly for drug addicts and sailors, and body piercings were largely confined to the circus sideshow. Now both are quite common: you run into them all the time, they're considered a normal way of expressing yourself, and they've sunk into mainstream life.

Here is the relevant question. You guys out there: if it were as socially acceptable as getting a tattoo, carried as little social stigma as a piercing, and cost were not an issue, would you undergo plastic surgery to have breasts?

Think about it. Males in this world are truly fascinated by that female body part. Would you like to have a pair of your own, to look at or touch any time you wished?