While the United States is at this time a seething cauldron of discussion relative to social changes in the administration of medical service, the subject is of equal, if not greater, concern in Great Britain and attracts constant consideration in most other countries.
During the war, medicine under the National Socialist state in Germany became an instrument of fascist domestic and foreign policy. According to Rosen,¹ physicians were required to accept and further the principles of National Socialism. Research was directed toward justifying National Socialist racial doctrines, many health insurance institutions were suppressed, and nature healing was officially recognized. In the ensuing years medicine in Germany deteriorated, became detached from scientific progress and lost such leadership as it had previously held in world medical science.
A survey of medical education in Western Germany during 1948² calls attention to the fact that purges of faculty members by the Nazi