Doctors work for their money just like everyone else. They don't just look at you and hand you a pill; they give you what you need to get better, so you can live a longer and healthier life.
People should get help no matter how much money they have.
If America offered free health care, more people would be able to get the help they need without paying a lot of money.