What Is Informed Consent?
Informed consent is the legal doctrine that requires doctors to disclose and adequately explain pertinent information to a patient so that the patient can make a voluntary, informed choice to accept or refuse treatment. Legally and ethically, patients have the right to decide what is done to their bodies. Physicians, in turn, have a duty to inform patients of the risks and benefits of proposed procedures, drugs, and therapies, and to obtain the patient’s consent before proceeding with treatment.