Do drugs have to be FDA approved?
Answers
Explanation:
FDA Approval is Required by Law
Federal law requires that all new drugs marketed in the U.S. be shown to be safe and effective for their intended use before they can be sold. However, some drugs are available in the U.S. even though they have never received the required FDA approval; these are typically older products that were already on the market before the current approval requirements took effect.