With all the politics going on, I've been wondering whether people are willing to pay more taxes in exchange for greater benefits. Everywhere I go, the only thing I hear from people is complaints about how much tax the government takes from them. But what if the government were giving a lot back to you in return?
Maybe it's just me, but I certainly would not mind paying more taxes if things like health care and job security were guaranteed by the government. People think that socializing something like health care will not work in the United States, but it seems to be working in MANY other democratic countries. Besides, if things like Social Security, the Federal Reserve, and the National Education Association are socialized, why not health care?
All I'm trying to say is this: if I get sick or injured, I just want to be able to walk (or stumble) into a hospital and get treated for free, without having to worry about my life (which most hospitals don't seem to care about anyway). Another thing that bothers me is that getting sick shouldn't put me at risk of losing my job. Are your thoughts different from mine? If so, I'd love to hear what you have to say, no matter where you're from.