Recently I fell sick, and since it was the first time I had to take care of myself, I quickly realized that I didn't know what to do. Should I take fever-reducing medicine or not? Do I just have a cold or something worse? How can I tell? Will it pass with time or will it get worse?
Conventional wisdom says that when in doubt you should just go see a doctor, but that isn't always practical, and it's expensive besides. We can leave the serious medicine to doctors, but we shouldn't be so helpless that we have to rely on them for everything. It strikes me as strange that we have a system of societal organization where the knowledge most essential to our own well-being is also the most outsourced and divorced from our control. The complete asymmetry of information in the doctor-patient relationship strikes me as dangerous, because a clueless patient has no way of defending herself against an insincere, mistaken, or scheming doctor.
That's why I think it would make a lot of sense to teach medicine in K-12 schools. Why not? After all, it's among the most practical things to learn. Teachers spend all of elementary school trying to convince us that math is useful, but the benefit of learning medicine is self-evident. If there's any guarantee in life, it's that every man, woman, and child is going to fall sick.
We already offer general health courses here and there that educate mostly about nutrition and STDs. This could just be another component. There's no need to pass the MCAT to understand that viral infections go away with time whereas bacterial ones don't, or that a nice hot ginger tea works well for clearing out mucus and wet coughs.