Physicians in the United States are doctors who practice medicine on the human body and are an important part of the country's health care system. The vast majority of physicians in the US hold a Doctor of Medicine (MD) degree, though some hold a Doctor of Osteopathic Medicine (DO) degree or a foreign equivalent such as the MBBS.