Healthcare is the maintenance and improvement of physical and mental health, especially through the provision of medical services, but to many Americans it is much more than a Webster's definition. The central question is: do all Americans have a "right" to healthcare?