Org Prep Daily

September 19, 2007

On Combichem

Filed under: industry life — milkshake @ 6:05 pm

A single failed reaction is a setback. 10 000 failed reactions is a library.


After two combichem industry jobs, these are the things I learned:

Testing cruds = We are sloppy
Testing mixtures of compounds = … but very optimistic
‘Libraries from libraries’ = Our slop got squared
On-resin-screening = I wished for this to work
One bead-one compound = …but it did not
Million-compound library = …so we pushed harder

What I ended up doing most of my time in combichem was making the precursors and building blocks, attaching them to the resin, deprotecting, loading the resin-bound compounds into the synthesizer or plates or syringes, then running one, maximum two combinatorial steps, then the cleavage, workup, purification. Lots of washing at every step. The preparatory and workup steps took more work than the actual combichem. (The purification was definitely the most tedious step, even with preparative HPLC.) It was an experience that made me increasingly fond of traditional medicinal chemistry.

One needs to be pretty cautious about doing medchem research in the combinatorial fashion. Even if most reactions could be adapted to combichem in principle, this usually means lots of development time. It is worth doing in a limited number of cases when:

1. The reactions are very clean
2. Lots of building blocks are available, preferably in a suitably-protected form
3. The set-up does not require a low temperature or strictly oxygen-free conditions
4. The chemistry is insensitive to reagent excess, goes to completion and won’t “over-react” when pushed hard (= turn into a mess because the optimum reaction time at a given temperature was grossly exceeded).

So the best type of reaction for combichem is the robust kind “that a zombie can do”. For example, using a reaction that is highly moisture-sensitive or prone to overheating-induced decomposition (e.g. the Mitsunobu) adds significant difficulties. Titrating the stoichiometry to 1.05 equivs of the reagent is not possible, nor is optimizing the reaction conditions for every single building block. Adapting chemistry to the combichem format grows exponentially harder if one is trying to synthesize a very large library at once. (Just preparing the reagent stocks for a large library is a chore. There are handling delays, etc.)

The other problem with combichem is that it encourages wishful thinking. One needs to be concerned about the building block combinations that do not work well. Failure is frequent if one tries to employ a wide-diversity set of building blocks – even straightforward reactions like acylation and alkylation (with an excess of the nucleophile) fail to progress or unexpectedly produce mixtures in a substantial number of cases. To avoid bad surprises, one needs to experiment on smaller pilot libraries first – to test the suitability of the blocks, the best solid-phase attachment, solvents, reaction conditions, cleaving method, purification sequence, etc.

Also one should not fool oneself that an automated synthesizer will be a great time-saver. These systems are very impressive in the showroom, pumping blue-colored water – but keeping them up and running is a full-time job: the gaskets can leak, the pipetting needle and the outlet channels get clogged, viscous solutions or stocks in highly volatile solvents won’t get transferred in the correct amounts, the heating/cooling or nitrogen gas flow is uneven, adjacent wells get cross-contaminated, moisture can condense in, etc. Firing up a big synthesizer can be unwieldy for small-scale exploratory work, and it can be a struggle to adapt the automated system to sensitive reactions that need lots of care and attention. Manual combichem experiments can be faster and easier to control. More than a few times I saw a big $200k all-Teflon automatic synthesizer system gathering dust in the corner – because the chemists got tired of the hassle.

Over the years, combichem gradually evolved into parallel synthesis of small sets (tens or hundreds) of individual compounds. Nowadays the parallel-synthesis-produced compounds are purified – so there is no actual difference from traditional medchem. The main impact of combichem was popularizing high-throughput solid-phase methods in synthetic chemistry.


The first combichem company that I joined (14 years ago) is still in business – they underwent a number of mergers and are now a part of Sanofi. Their new owners like them because they got good hits from their libraries with reasonable frequency. (The chemists there have a pretty decent system for cranking out the compounds – too bad that they are discouraged from publishing their results.) It has always been a fairly small operation – but unlike many other combichem companies it survived until today.

I think the future is in outsourcing. We have recently screened some commercially available collections that were produced by traditional chemistry abroad: 90%+ purity, purified and bottled. I was very surprised by the number and high quality of the resulting screening hits. A set of 1000 dissimilar compounds that were made individually, pure, is in my opinion more useful than 10 general combichem libraries with 1000 compounds each.

[This post was inspired by Derek Lowe's In the Pipeline]

September 6, 2007

Copycat drugs

Filed under: industry life — milkshake @ 1:13 am

[Image: storks.jpg – credit: Jiri Sliva]

Levitra is a close Viagra knock-off and there is little difference between the two for the patient. “Money wasted on developing yet another version of the erection pill could have been put into malaria research instead.” And so on.

I am not a fan of Big Pharma but I would like to point out that the misery of the Third World was not caused by the lack of drugs or by their high prices – the root problems have more to do with awful governments and wars. But if most patients there cannot pay for the new antimalarials / HIV inhibitors / tuberculostatics, the research interest will get re-directed elsewhere, and one can hardly blame the industry for that. (Another complication is that even if the drugs are provided for free, it is hard to get these patients to take their medications in a disciplined manner for the prescribed period of time – and hence multidrug resistance soon emerges.)


Sepracor’s business plan used to be based entirely on patent-busting; Sepracor was scorned for it by many people in the industry. Yet this kind of “bottom-feeding” research saved some good drugs from oblivion – drugs that would not otherwise have been introduced into the US. As a result, a next generation of drugs appeared in previously stagnant fields (antihistamines, sleep meds).

Competition from me-too drugs produces pressure that pushes innovation – without it, companies selling popular drugs are in no hurry to develop their more advanced follow-on candidates if their established drug is doing fine.

A late-comer drug needs to offer some demonstrable advantage over the more established ones. The me-too drug can, for example, have a different pharmacology from the original drug – whether this was a part of the design or just a coincidence. (The drug distribution, metabolism, drug-drug interactions and side-effect profile are notoriously hard to predict: a small change in the structure can have an important effect which might not be apparent until the testing is done in patients.) The main reason for doing me-too medchem research is that it is a lot easier: developing a new structural class from a low-potency screening hit can take years, and there is no guarantee that this optimization will ever produce a drug candidate. Starting with a proven competitor’s compound and modifying it in a few places greatly shortens the development time and improves the chances of success.

It is just impossible to exhaust all the possibilities inherent in a class of molecules. The chemists and biologists in the team that discovered the original drug must have made choices at numerous points of the project – when they picked the directions they thought were worth pursuing. Their project had deadlines, and it was influenced by personal experience and bias. Independently re-visiting the original data and premises with hindsight knowledge of the clinical performance, and with a competitive mindset, can lead to surprises – and eventually to an improved drug candidate.

There is also a natural tendency of compounds coming out of different groups to converge: it is a common practice in the industry to re-synthesize and test the published compounds. The added insight from their testing (especially when combined with X-ray crystallography) can quickly close the gap between the competing groups. This too can produce the appearance of a me-too approach.

September 4, 2007

Soliciting advice on diazomethane

Filed under: questions — milkshake @ 9:49 pm

[Image: diazo.jpg – credit: Jiri Sliva]

I have been doing the Arndt-Eistert homologation (example: OrgSyn 79, p. 154, 2002), so I needed a pure anhydrous diazomethane solution. I made the diazomethane from nitrosomethyl urea (NMU) and, to avoid the distillation, I transferred the diazomethane from my KOH-dried diazomethane solution in toluene by passing a stream of Ar through it (at R.T. for 30 min) and condensing the liberated dry Ar-diluted diazomethane gas directly into the reaction mixture cooled to -78 C. (I heard of this alternative to distillation of the diazomethane solution from the Rapoport group – and it worked, though I cannot recommend doing this kind of gas transfer on a large scale because diazomethane is so nasty and toxic.)

I have only limited experimental experience with diazomethane. There is plenty of literature on the subject. But I would like to ask you about your personal perspective: What is your favorite method of making high-purity dry diazomethane solutions (that can be used for sensitive applications such as the Arndt-Eistert)?

One reason why I am asking this is that it appears to me that all the commonly used precursors of diazomethane have some problems associated with their use: Diazald needs the presence of an alcohol (preferably a high-boiling one) or a phase-transfer catalyst to work with aqueous hydroxide, and the produced diazomethane solution should be distilled to remove the sideproducts. MNNG is highly toxic and difficult to buy nowadays. NMU is expensive, explosive and very carcinogenic, and the NMU-produced diazomethane solution contains a trace of methylamine. So I would be delighted to learn what you have tried and liked.

The other reason is that I have one rather simple-minded idea – which I would like to try, to see if it works – for a method of preparing a contaminant-free anhydrous diazomethane solution without the risky distillation step. I would like to know if you think it is a worthwhile use of lab time, and if you think a new method is even needed – a method that would be convenient and work with benign commercial reagents.
