Most agree that the Affordable Care Act, commonly known as Obamacare, has fundamentally changed the health insurance landscape in the United States, but the health-care law is also quietly causing a sea change in the way hospitals and doctors treat patients and do business. “The entire industry is really doing a paradigm shift,” said Terri Thompson, vice president of population health for ProMedica.
Story Date:
August 31, 2014