Discussion about this post

Joe Clark:

Oh, you're so cute! You know that's not what was meant.

dmytryl:

Good point, perhaps I was unclear.

I mean the reason people say "it seems to be extremely unlikely" is that such "viruses" (the prevailing variant from a throat swab or whatever) are pretty well conserved, because they have more or less exhausted all the most probable improvements mutations can make.

I see what you're saying - the patient gets infected and then infects others outside the home while not yet ill.

The study also says that this was observed in children 0 to 11 months old. So the dose dependence may also be specific to maternal antibodies.

In any case, while it is plausible that there is dose dependence in general, jumping to it as the explanation for variolation seems very silly. They dried the scabs and let them sit for a while.

The general rule of thumb for chemical reactions is that rates roughly double for every 10 K increase in temperature.

So basically they would have had extreme variation in the number of still-viable virions from one sample to the next, depending on temperature.
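Just to make the magnitude concrete, here's a rough sketch assuming a Q10 of 2 (rate doubles per 10 K) and simple first-order inactivation of virions during storage. The reference rate, storage time, and temperatures are made-up illustrative numbers, not anything from the historical record:

```python
import math

# Hedged sketch: assumes a Q10-of-2 rule of thumb (inactivation rate roughly
# doubles per 10 K) and first-order exponential decay of viable virions.
# The reference rate, temperatures, and 30-day storage time are illustrative.

def decay_rate(temp_c, ref_rate_per_day=0.1, ref_temp_c=15.0, q10=2.0):
    """First-order inactivation rate at temp_c, scaled by the Q10 rule."""
    return ref_rate_per_day * q10 ** ((temp_c - ref_temp_c) / 10.0)

def surviving_fraction(temp_c, days):
    """Fraction of virions still viable after `days` at `temp_c`."""
    return math.exp(-decay_rate(temp_c) * days)

for temp in (5, 15, 25, 35):  # e.g. a cold room vs. a hot summer room
    print(f"{temp:>2} C: {surviving_fraction(temp, days=30):.6f} viable after 30 days")
```

Under those made-up numbers the surviving fraction spans several orders of magnitude (roughly 0.22 at 5 C down to about 6e-6 at 35 C), which is the point: dose from one dried sample to the next would be wildly uncontrolled.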

Also, the first truly "man-made" vaccine for a viral disease, the rabies vaccine, was made by drying neural tissue from rabbits. That one we know was a dead-virus vaccine, because we kept making those all the way into the 20th century (if not until now, in countries that can't make a better vaccine).

So there's a far more plausible explanation: that variolation worked like a badly made dead-virus vaccine, with some live virus present, maybe some knock-out variants, etc. (And another highly plausible explanation is that it used the less lethal strain, which we know had a mortality rate similar to that of variolation.)
