Archive for the ‘Biology’ Category

And avian eyes! A very nice piece about quasicrystals and random packing, and how all of these relate to the arrangement of colour-sensitive cone cells in the eyes of birds. Extremely well written and strongly recommended.
We have heard about Verghese Kurien’s switch to metallurgy thanks to his reaching Michigan State University to study dairy engineering. Now I learn from here (thanks to Arunn at Nanopolitan) that a mix-up in the admissions office allowed Sir John Gurdon to study zoology instead of classics, and that he is this year’s winner of the Nobel Prize in Physiology or Medicine!
After receiving the report Sir John said he switched his attention to classics and was offered a place to study at Christ Church, Oxford, but was allowed to switch courses and read zoology instead because of a mix-up in the admissions office.
It was at Oxford as a postgraduate student that he published his groundbreaking research on genetics and proved for the first time that every cell in the body contains the same genes.
He did so by taking a cell from an adult frog’s intestine, removing its genes and implanting them into an egg cell, which grew into a clone of the adult frog.
The idea was controversial at the time because it contradicted previous studies by much more senior scientists, and it was a decade before the then-graduate student’s work became widely accepted.
But it later led directly to the cloning of Dolly the Sheep by Prof Ian Wilmut in 1996, and to the subsequent discovery by Prof Yamanaka that adult cells can be “reprogrammed” into stem cells for use in medicine.
The report in question is very interesting too. Take a look!
Two interesting papers from PNAS:
Take a look!
A commentary in BMC biology that is written in a very nice fashion (and very interesting too):
In summary, neural networks have been designed to have outputs that degrade gracefully as network elements are eliminated or their properties perturbed. Such a design principle makes the networks work better for the animals, but simultaneously makes life harder for neuroscientists who want to learn how the network works by making measurements on the network as it does its job.
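The graceful degradation the authors describe can be sketched with a toy population code (entirely my own illustration, not a model from the article): many noisy units vote on a quantity, the readout averages them, and eliminating units raises the error gradually rather than catastrophically.

```python
import random

def readout_error(n_units, signal=1.0, noise=0.3, trials=2000, seed=0):
    """Mean absolute readout error when the estimate is the average of
    n_units noisy units, each reporting signal + Gaussian noise."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        estimate = sum(signal + rng.gauss(0, noise) for _ in range(n_units)) / n_units
        total += abs(estimate - signal)
    return total / trials

# Removing units degrades the readout gradually, not catastrophically:
for n in (100, 50, 10, 1):
    print(n, round(readout_error(n), 4))
```

With redundancy like this, the error grows roughly as one over the square root of the surviving units, which is exactly what makes single-unit measurements such a frustrating window onto the whole network.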
Take a look!
Oh! Just for that title I am writing this post about this piece in PNAS:
Hummingbird tongues pick up a liquid, calorie-dense food that cannot be grasped, a physical challenge that has long inspired the study of nectar-transport mechanics. Existing biophysical models predict optimal hummingbird foraging on the basis of equations that assume that fluid rises through the tongue in the same way as through capillary tubes. We demonstrate that the hummingbird tongue does not function like a pair of tiny, static tubes drawing up floral nectar via capillary action. Instead, we show that the tongue tip is a dynamic liquid-trapping device that changes configuration and shape dramatically as it moves in and out of fluids. We also show that the tongue–fluid interactions are identical in both living and dead birds, demonstrating that this mechanism is a function of the tongue structure itself, and therefore highly efficient because no energy expenditure by the bird is required to drive the opening and closing of the trap. Our results rule out previous conclusions from capillarity-based models of nectar feeding and highlight the necessity of developing a new biophysical model for nectar intake in hummingbirds. Our findings have ramifications for the study of feeding mechanics in other nectarivorous birds, and for the understanding of the evolution of nectarivory in general. We propose a conceptual mechanical explanation for this unique fluid-trapping capacity, with far-reaching practical applications (e.g., biomimetics).
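For context, the static capillary-tube picture that the authors rule out predicts the equilibrium rise of a liquid in a tube of radius $r$ from Jurin's law (a standard fluid-statics result, not a formula taken from the paper):

```latex
h = \frac{2\gamma \cos\theta}{\rho g r}
```

where $\gamma$ is the surface tension of the nectar, $\theta$ the contact angle, $\rho$ the density, and $g$ gravitational acceleration. Note that nothing in this picture moves or changes shape, which is precisely what the paper's dynamic liquid-trapping observations contradict.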
That is the way to progress; the seed article talks about how bio-engineering benefits from tool making and sharing; it is true for scientific software too, by the way:
We do not know how to make biology easy to engineer (think playing with Legos or coding software with Java). However, technical inventions prototyped over the past six years point the way to a future in which biology is much easier to engineer relative to today. For example, in the summer of 2009, a team of undergraduates at the University of Cambridge won the International Genetically Engineered Machines (iGEM) competition by engineering seven strains of E. coli, each capable of synthesizing a different pigment visible to the naked eye. The resulting set, collectively known as E. chromi, required rerouting the metabolism of the bacteria so that natural precursor chemicals are converted across a palette of seven colors, from red to purple; such genetic color generators can be used to program microbes to change color in response to otherwise invisible environmental pollutants or health conditions. A few years ago such a project would have required several PhD-level experts in biology and metabolic engineering and would have likely taken a few years. Today, undergraduates can perform such work in months. This change in reality is due to two advances—tools and sharing—both of which are ready for their own revolutions.
Take a look!
I was very impressed by Schrödinger’s What is Life?; as is my wont, when I read the book (nearly 12 years ago), I filled a large number of sheets with notes. However, I have never revisited the book — though, on and off, I have been thinking that I should re-read What is Life? (another book that I loved and want to re-read, by the way, is Hardy’s A Mathematician’s Apology).
The latest BioMed Central carries a comment piece and a Q&A piece on biophysics, both of which make very interesting reading. In the comment piece by Gratzer, I found the following commentary about Schrödinger’s book very interesting (it is the reason for my renewed interest in re-reading the book — to see how it sounds to me now):
Biophysics was seen, then, as a prop for the serious business of physiology, and it later also became conflated with medical physics, which essentially meant radiology. Hospital physicists would have ranked low in the medical hierarchy and in the esteem of the physics fraternity. The Olympians, such as Bohr (with his obiter dicta on the implication of the Complementarity Principle for biology) and Schrödinger, ruminated languidly on the nature of life, but biologists who, as Peter Medawar put it, ‘operate at the frontier between bewilderment and understanding’, were not generally regarded in such quarters as altogether scientifically house-trained. The Victorian physicist PG Tait spoke of ‘minds debauched by the so-called science of biology’, while for Rutherford there were only physics and stamp-collecting. But then one of their own, no less than Erwin Schrödinger, came out with a slim volume with the modest title, What is Life? It appeared in 1944, a few months before the end of the Second World War, and it received close attention from physicists and physical chemists, many of them wearied by years in war work, and in want of a fresh outlet for their talents. It is remarkable indeed how many of the founders of the new biology were animated by Schrödinger’s little book. For its message was that biology really was physics, despite the apparent conflict between life and thermodynamic imperatives, and especially that the vehicle of heredity, so far from being a kind of intangible essence, would turn out to be an ‘aperiodic crystal’. The concept was never properly defined, but it carried the alluring implication that it might be open to study by established physical methods, most obviously X-ray diffraction.
It was only much later that some of those who had been captivated by Schrödinger’s dissertation began to wonder why they had so uncritically swallowed it all. Max Perutz reflected in 1987 on the author’s sleight of hand: ‘A close study of the book and of the related literature has shown me that what was true of the book was not original, and most of what was original was known not to be true even when it was written’. More, ‘the apparent contradiction between life and the statistical laws of physics can be resolved by a science largely ignored by Schrödinger. That science is chemistry’. Perutz’s strictures, it should be said, did not go unchallenged, and drew, in particular, a lucid response from an eminent geneticist and quondam associate of Schrödinger’s, Neville Symonds. At all events there is no doubting the book’s effect in making biophysics attractive to many and at least halfway respectable. It is curious though that its rise was prefigured in a novel, published in 1934: The Search by CP Snow has for its hero a visionary young physicist who procures funding to set up in London, at a location plainly King’s College, a department of biophysics.
In any case, both the comment and Q&A pieces are worth your while. Have fun!
A very interesting piece in BioMed Central by Wolpert and Flanagan on the workings of the brain as understood using robots.
In this very interesting article in BioMed Central, Gregory Petsko writes about the three phases of drug testing during the development of new drugs. The first phase assesses the toxicity of the drug to humans and is carried out on a healthy population. The second phase assesses how well the drug works in patients with the disease. The third phase assesses the effectiveness of the drug in comparison to what is currently available on the market.
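The attrition compounds across the phases; a toy calculation (the per-phase rates below are illustrative assumptions of mine, not figures from Petsko's article) shows how modest per-phase success rates multiply out to an overall rate of roughly one in twenty:

```python
# Toy illustration: assumed per-phase pass rates, not data from the article.
# If each phase is passed independently, the overall success probability
# is the product of the per-phase rates.
phase_rates = {"Phase I": 0.6, "Phase II": 0.15, "Phase III": 0.55}

overall = 1.0
for phase, rate in phase_rates.items():
    overall *= rate

print(f"overall success: {overall:.3f}")  # roughly 1 in 20
```

The point of the arithmetic is simply that the lowest-yield phase dominates the product, which is why Phase II failures pile up into the large untapped resource Petsko describes.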
Apparently, 19 out of 20 drug trials fail, and most of them fail at Phase II. The interesting point that Petsko makes is:
My main point is that the Phase II failures represent an enormous, untapped resource for the biomedical sciences – a resource that could go a long way towards solving the problem of low productivity, in terms of cures, that plagues both industry and academic medicine.
You see, the Phase II failures have all passed Phase I, so they have been shown to be safe in humans. They failed for efficacy. They failed because they did not effectively treat the disease they were intended to treat, even though they showed biological activity in assays and model systems. There are hundreds of them – perhaps more than a thousand. I don’t know the number because drug companies bury those failures. They don’t want to release a lot of information about the molecules in question because, among other things, they fear that will give their competitors too much of an insight into what they are working on. But here’s the question I would like you – and them – to ponder. What if those drugs were not tried on the right disease?
We now know that many quite different diseases share common pathways and processes in the cell. Cancer is a disease of abnormal cell survival; in Alzheimer’s disease the survival pathways have failed. Alzheimer’s patients have significantly lower risk of many cancers. What if the cure for Alzheimer’s disease is sitting on some drug company’s shelf, as a potential cancer drug that failed in Phase II? (A biotech company called Link Medicines is currently testing one such failure to find out.) Gaucher disease and Parkinson’s disease both involve lysosomal damage and display aggregates of a protein called alpha-synuclein; Gaucher carriers are at elevated risk for Parkinson’s. What if a drug intended to cure Gaucher disease, one that failed in Phase II, is actually a treatment for Parkinson’s? (Another biotech company, Amicus Therapeutics, is beginning to investigate that possibility.) Recent studies show that people diagnosed with psoriasis are at greater risk of developing heart disease; in fact, in patients with severe psoriasis who are younger than 50 years old, the risk is comparable to that seen in diabetes. How many Phase II-failed psoriasis drugs have ever been tested in heart disease clinical trials?
A very interesting piece!
PS: While you are at it, these two pieces on mammalian pheromones are interesting too: