The concept of the ‘magic bullet’ is intimately associated with the name of Paul Ehrlich, who applied it to the arsenical drug Salvarsan (discovered in 1909) that was the first effective treatment for syphilis. The phrase was taken to apply to toxic molecules that selectively target a specific organism, and as molecular pharmacology developed came to mean substances that bind to or inhibit a specific target molecule with high specificity. (Ehrlich was particularly interested in molecular selectivity, and received the 1908 Nobel Prize in Physiology or Medicine for other work in immunology.)

Now, genetic studies in which we seek to relate the behaviour of genes to observable phenotypes are commonly performed in either a ‘forward’ or a ‘reverse’ manner. In ‘forward’ genetics, we start by introducing random mutations and screening for a phenotype of interest, after which we may seek the gene(s) responsible for that phenotype (a ‘function-first’ strategy). By contrast, in reverse genetics, we mutate a specific known gene and then observe the resulting phenotype (‘gene-first’).

Traditional pharmacology was performed in an analogous ‘forward’ or ‘top-down’ manner, in which a drug (rather than a mutation) would be sought that had the desired effect in vivo (or at least in a physiological preparation or tissue); its molecular mode(s) of action could then be determined afterwards. In chemical genetics this strategy is known as forward chemical genetics. More recently, pharmacology and drug discovery have focussed on the ‘reverse’ approach, in which we seek to inhibit a specific molecular target by screening many thousands or even millions of candidate drugs for high potency, and only when a suitable effector is found do we ask whether it displays the desired phenotype in the whole organism. It is more or less well known that this second approach, which became favoured following the systematic genome sequencing programmes, has not been overwhelmed by success: huge numbers of candidate drugs fall by the wayside during the discovery pipeline, a phenomenon known as ‘attrition’. There are many reasons for this, some of which will be covered in later blogs, but the one I shall focus on is the fact that very few networks or processes are blocked by the partial inhibition of just a single target.

Imagine a 4-step pathway A → B → C → D, in which we wish to inhibit the flux to D. We might try inhibiting the step between B and C, but even if this works initially, adding more inhibitor soon becomes ineffective, as control of the flux will have shifted to another step (for instance, blocking the B-to-C step may lower the concentration of C such that the C-to-D step is now much ‘slower’ and comes to dominate the control). In fact, metabolic control analysis (tutorial) tells us that every single step controls, in part, the flux to D. Theoretical and empirical observations, reviewed e.g. by Lehár and colleagues, by Zimmermann and colleagues, and by Hopkins, show that strategies for manipulating pathway fluxes and system behaviour with small molecules can be far more effective when the molecules have multiple targets. We focus here on the latter paper.
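The central MCA claim — that every step shares control of the flux — is easy to see in a minimal sketch. This is my own toy model, not taken from any of the cited reviews: each step is given reversible linear kinetics, so at steady state the pathway behaves like resistors in series, and the flux control coefficients can be estimated numerically.

```python
# Toy linear pathway A -> B -> C -> D with reversible linear kinetics
# (an illustrative assumption): at steady state the flux behaves like
# current through resistors in series, J = A / (1/k1 + 1/k2 + 1/k3),
# and each step's flux control coefficient is its share of the total
# 'resistance'.

def steady_state_flux(A, ks):
    """Steady-state flux through the chain (resistors-in-series analogy)."""
    return A / sum(1.0 / k for k in ks)

def control_coefficients(A, ks, dk=1e-6):
    """Numerical flux control coefficients C_i = (dJ/J) / (dk_i/k_i)."""
    J = steady_state_flux(A, ks)
    coeffs = []
    for i, k in enumerate(ks):
        perturbed = list(ks)
        perturbed[i] = k * (1 + dk)          # nudge one rate constant
        dJ = steady_state_flux(A, perturbed) - J
        coeffs.append((dJ / J) / dk)
    return coeffs

A = 10.0
ks = [5.0, 2.0, 1.0]                # rate constants for the three steps
C = control_coefficients(A, ks)
print([round(c, 3) for c in C])     # → [0.118, 0.294, 0.588]: every step has some control
print(round(sum(C), 3))             # → 1.0: the summation theorem of MCA
```

No single coefficient is 1, so no single step is ‘the’ rate-limiting one, which is exactly why piling inhibitor onto one step gives diminishing returns.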

In his review of the subject, Hopkins notes the evidence that most biological networks are robust to inhibition at single points (one may argue that evolution might be expected to select for such network structures, as they provide robustness against the effects of single mutations). This leads to the concept of polypharmacology, in which one seeks the specific binding of a single compound to two or more molecular targets, or uses cocktails of inhibitors that similarly affect multiple targets. The question, of course, is which targets one should pick, and this is a huge combinatorial optimisation problem. Thus, while empirical and knowledge-based approaches are useful, an especially valuable approach tests the effects of modifying models of biological networks in silico, as this is massively more convenient, cost-effective, and free of ethical considerations. In many cases, inhibiting a comparatively small number of targets (but more than one) has large effects under conditions where inhibiting just one has little effect.
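The robustness point can be caricatured with a toy network of my own invention (it is not taken from Hopkins): two parallel branches supply a product that is consumed by a saturable downstream step. Because supply capacity exceeds demand, knocking down one branch barely dents the flux, while the same knockdown applied to both branches collapses it.

```python
# Toy 'robust network': two parallel branches with conductances g1, g2
# produce S from a clamped precursor A; a saturable step consumes S.
# The steady state is found by bisection on the supply = demand balance
# (supply falls with S, demand rises with S, so one root exists).

def network_flux(g1, g2, A=10.0, Vmax=1.0, Km=1.0):
    """Flux through the saturable consuming step at steady state."""
    supply = lambda S: (g1 + g2) * (A - S)   # production of S
    demand = lambda S: Vmax * S / (Km + S)   # saturable consumption of S
    lo, hi = 0.0, A
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if supply(mid) > demand(mid):
            lo = mid
        else:
            hi = mid
    return demand(0.5 * (lo + hi))

base = network_flux(1.0, 1.0)
one_hit = network_flux(0.01, 1.0)            # 99% knockdown of one branch
both_hit = network_flux(0.01, 0.01)          # 99% knockdown of both branches

print(f"one target:  {1 - one_hit / base:.1%} flux reduction")   # well under 1%
print(f"two targets: {1 - both_hit / base:.1%} flux reduction")  # roughly 80%
```

Even a 99% knockdown of a single branch is buffered away by the remaining one, whereas hitting both targets at once is dramatically effective — the essence of the polypharmacology argument.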

If all this is true, one may suppose that existing successful drugs do in fact work via multiple targets, despite their having been selected initially on the basis of inhibiting just one. I shall blog at length about this on another occasion (meanwhile see the ‘iron review’), but there is abundant evidence for this in the case of the pleiotropic and anti-inflammatory effects of both statins and glitazones. (Note that this is very different from the promiscuity commonly seen with hydrophobic drugs.)

A related approach, in which the (biotechnological) aim was to maximise a flux (in this case valine production), showed by analysing a metabolic model in silico that the expression of just three genes needed to be changed to achieve a large improvement in productivity. This was beautifully confirmed experimentally by Sang-yup Lee and his colleagues. Similar analyses can clearly be applied to the biotechnological improvement of any biological organism or process, such as the development of processes for bioenergy, a subject whose Review Panel I chaired for BBSRC and on which we shall shortly be making a major announcement.
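The shape of such an in silico search can be sketched in a few lines: given any model that scores a candidate set of amplified genes, one simply enumerates all small subsets and keeps the best. The gene names and weights below are invented purely for illustration (the actual analysis used a metabolic model, as described above), but the brute-force combinatorial search is the same in kind.

```python
from itertools import combinations

# Invented stand-in for an in-silico "which few genes should we amplify?"
# search: the 'model' is just a scoring function over candidate gene sets,
# and we exhaustively test every subset of size 3. Names and weights are
# hypothetical, chosen only to make the example concrete.

weights = {"ilvA": 0.40, "ilvB": 0.25, "ilvC": 0.05,
           "ilvD": 0.20, "ilvE": 0.02, "brnF": 0.30}

def predicted_gain(amplified):
    """Invented score: diminishing returns on the summed gene weights."""
    total = sum(weights[g] for g in amplified)
    return total / (1.0 + total)

best = max(combinations(sorted(weights), 3), key=predicted_gain)
print(best, round(predicted_gain(best), 3))
# → ('brnF', 'ilvA', 'ilvB') 0.487
```

For a handful of genes this exhaustive enumeration is trivial; the real difficulty, and the reason genome-scale models earn their keep, is that the number of subsets explodes combinatorially as the genome grows.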

Overall, then, it is clear that the era of the magic bullet is coming to a close and that to make progress we need to improve our understanding of systems in a much broader sense; for chemical pharmacology and chemical genomics, it is time to bring on the magic buckshot.
