healthcare (Noun) — The preservation of mental and physical health by preventing or treating illness through services offered by the health professions.
healthcare (Noun) — Social insurance for the ill and injured.