I was just talking w/my girlfriend about this. I can't stress enough how important I think it is to change people's minds about healthcare. Every other industrialized nation (the good ones, anyway) provides its citizens with some sort of universal health care. It's thought of as...a right, not a handout. It's absurd that here in America people can go bankrupt AND DIE because our government (and big business) doesn't want to provide citizens with healthcare. What is that all about? I just don't get it.
I know... I was astonished by some of the comments on that article. People can be so selfish. Me, me, me... Why should *I* have to...? I would gladly pay a few bucks in taxes if I could go to bed at night knowing that everyone had decent health care.