The Centers for Disease Control and Prevention (CDC) is the leading national public health institute of the United States. The CDC is a federal agency under the Department of Health and Human Services.