We live in a time when many religious people feel fiercely threatened by science. O ye of little faith. Let them subscribe to Scientific American for a year and then tell me if their sense of the grandeur of God is not greatly enlarged by what they have learned from it. Of course many of the articles reflect the assumption at the root of many problems, that an account, however tentative, of some structure of the cosmos or some transaction of the nervous system successfully claims that part of reality for secularism. Those who encourage a fear of science are actually saying the same thing. If the old, untenable dualism is put aside, we are instructed in the endless brilliance of creation. Surely to do this is a privilege of modern life for which we should all be grateful.
The next time you stub your toe, take out a telescope and look at your foot through the wrong end: According to researchers at Oxford University, such visual distortions have a powerful effect on how we perceive pain.
The scientists found that subjects who looked at a wounded hand through the right end of a pair of binoculars felt more pain and experienced increased swelling in that limb. But when the binoculars were flipped around, the suffering and swelling were lessened dramatically.
Science Watch: What technological applications do you foresee for graphene, and are we going to need new technologies to create it to make these applications viable?
. . . I’m always very skeptical about applications. When someone asks about applications in my talks, I usually tell a story about how I was on a boat one day watching dolphins, and they were jumping out of the water, allowing people to nearly touch them. Everyone was mesmerized by these magnificent creatures. It was an extraordinary romantic moment—well, until a little boy shouted out, "Mom, can we eat them?" It's a similar matter here—as in, okay, we just found this extraordinary material, so we're enjoying this romantic moment, and now people are asking if we can eat it or not. Probably we can, but you have to step back and enjoy the moment first.
—2010 physics Nobel laureate Andre Geim, in a 2008 ScienceWatch interview, on preserving the romance of discovery
In the story of evolution as we know it, there is no Eden. The world was never peaceful and perfect, free from death and pain. Rather the upward curve has been constant and gradual, always accompanied by death, which culls those creatures less than fit for survival.
Nor does evolution tell a story of Adam and Eve. The human race also has risen incrementally. Evolution suggests no story of atonement, for it knows nothing of personal responsibility, law, sin, shame or redemption.
The absence of such information in the story of evolution does not suggest to me, as it does to Haught, that we must throw out large parts of the Bible’s story. One learns different things from different sources—some things from science, some things from music, some things from the Bible. On purely empirical grounds, I would insist that sin is as reliable a fact as can be, even if evolution knows nothing of it. The same with beauty and truth and love—equally missing from evolution as Darwin traces it. Evolution knows of populations, not individuals. It could never chastise Cain for killing his brother. Murder is part of evolution’s mechanism of change. It would be a good thing, if evolution knew anything about “good.”
“Good” is not part of evolution’s story. Nevertheless good exists.
In 1953, Dr. Borlaug began working with a wheat strain containing an unusual gene. It had the effect of shrinking the wheat plant, creating a stubby, compact variety. Yet crucially, the seed heads did not shrink, meaning a small plant could still produce a large amount of wheat.
Dr. Borlaug and his team transferred the gene into tropical wheats. When high fertilizer levels were applied to these new “semidwarf” plants, the results were nothing short of astonishing. The plants would produce enormous heads of grain, yet their stiff, short bodies could support the weight without falling over. On the same amount of land, wheat output could be tripled or quadrupled. Later, the idea was applied to rice, the staple crop for nearly half the world’s population, with yields jumping several-fold compared with some traditional varieties. This strange principle of increasing yields by shrinking plants was the central insight of the Green Revolution, and its impact was enormous.
Anybody who has experienced fatherhood or motherhood knows about the power of infants. The arrival of a baby completely changes the structure and life of the whole family. One could say, in fact, that the infant is the one who holds the authority: the activities of the whole family are ordered to his needs. What is true for infants is also true for sick, handicapped and aged people. As I have argued above, they have a real power to reorganize human communities. But I believe that the experience humans repeatedly have is that there is something beyond. Entering into relation with the weak may become an experience of discovering and accepting our own weaknesses: discovering, indeed, that whenever I recognize that I am weak, then I am strong, and entering through this experience into a world of fragility and vulnerability that we share with our friends who have had the same experience, a world that becomes a world of kindness, mercy and love.
According to the conventional wisdom still taught in schools and repeated by many public intellectuals, Galileo bravely spoke truth (science) to power (the Church), and paid dearly for it, spending his dying days in prison. Except that it's not true. Ronald L. Numbers' Galileo Goes to Jail: And Other Myths About Science and Religion, just out from Harvard University Press, is only the most recent attempt to set the historical record straight on "myths," including its Number Eight: That Galileo Was Imprisoned and Tortured for Advocating Copernicanism. Apparently Carl Sagan's quip that Galileo was "in a Catholic dungeon threatened with torture" has all the academic rigor of the Indigo Girls song that begins "Galileo's head was on the block."
Consider: Galileo's Dialogue Concerning the Two Chief World Systems, the source of controversy, previously had been read and approved by the Church's censors; and Pope Urban VIII, who presided over the trial, was Galileo's friend and admirer. Consider also: prior to the trial, Galileo stayed in the Tuscan embassy; during the trial, he was put up in a six-room apartment, complete with servant; following the trial, his "house arrest" consisted of being entertained at the palaces of the grand duke of Tuscany and the Archbishop of Siena. Galileo, apparently, was no ordinary heretic.
Behavior | From eating vultures to clear up syphilis to treating H.I.V. with garlic and beetroot, quack medicine persists in folk remedies around the world, writes Ewen Callaway in New Scientist. Now an Australian study describes the cascades of human gullibility that help explain why.
Put simply, person X uses snake oil to treat her goiter, arthritis or what have you. Seeing this, friends assume the snake oil works, and more follow suit. Because it doesn’t work, X persists in using it, so more gullible people are exposed to the folly and fall for it than if X had been quickly cured by an effective treatment.
Four out of five hucksters couldn’t have done better. [New Scientist]
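The cascade described above can be sketched as a toy simulation. This is my own illustration, not the Australian study's actual model; the function name, parameters, and dynamics are all invented for the sketch. The key asymmetry: an effective cure removes its visible users quickly, while an ineffective one leaves them sick and in public view, so it keeps recruiting imitators.

```python
import random

def simulate(steps=30, recruit_prob=0.3, cure_prob=0.8, seed=42):
    """Toy model of the snake-oil cascade: sick people copy whichever
    treatment they see in use. The effective cure removes its
    demonstrators quickly; snake oil never cures, so its users stay
    visible and keep recruiting imitators."""
    rng = random.Random(seed)
    demonstrators = {"effective": 5, "snake_oil": 5}  # visible users
    adoptions = {"effective": 0, "snake_oil": 0}      # cumulative copies
    for _ in range(steps):
        for treatment, cures in (("effective", cure_prob),
                                 ("snake_oil", 0.0)):
            # each visible user may persuade one onlooker to copy them
            new = sum(rng.random() < recruit_prob
                      for _ in range(demonstrators[treatment]))
            demonstrators[treatment] += new
            adoptions[treatment] += new
            # cured users recover and stop demonstrating the treatment
            demonstrators[treatment] = sum(rng.random() >= cures
                                           for _ in range(demonstrators[treatment]))
    return adoptions
```

Under these invented parameters the quack remedy accumulates far more adopters than the effective one, even though each onlooker applies the same copying rule to both: persistence of illness, not efficacy, is what gets advertised.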
Science is fueled by passion, a passion that is often attached to the world of objects much as the artist is attached to his paints, the poet to her words. From my first days at the Massachusetts Institute of Technology in 1976, I saw this passion for objects everywhere. My students and colleagues told how they were drawn into science by the physics of sand castles, by playing with soap bubbles, by the mesmerizing power of a crystal radio.
Since this was the early days of computer culture, there was also talk of new objects. Some people identified with their computers, experiencing these machines as extensions of themselves. For them, computers were useful for thinking about larger questions, questions of determinism and free will, of mind and mechanism ...
Objects don’t nudge every child toward science, but for some, a rich object world is the best way to give science a chance. Given the opportunity, children will make intimate connections, connections they must construct on their own ...
If we attend to young scientists’ romance with objects, we are encouraged to make children comfortable with the idea that falling in love with things is part of what we expect of them. We are encouraged to introduce the periodic table as poetry and LEGOs as a form of art.
Rather than infer that nanotechnology is safe, members of the public who learn about this novel science tend to become sharply polarized along cultural lines, according to a study conducted by the Cultural Cognition Project at Yale Law School in collaboration with the Project on Emerging Nanotechnologies. The report is published online in the journal Nature Nanotechnology.
These findings have important implications for garnering support of the new technology, say the researchers.
The experiment involved a diverse sample of 1,500 Americans, the vast majority of whom were unfamiliar with nanotechnology, a relatively new science that involves the manipulation of particles the size of atoms and that has numerous commercial applications. When shown balanced information about the risks and benefits of nanotechnology, study participants became highly divided on its safety compared to a group not shown such information.
The determining factor in how people responded was their cultural values, according to Dan Kahan, the Elizabeth K. Dollard Professor at Yale Law School and lead author of the study. “People who had more individualistic, pro-commerce values, tended to infer that nanotechnology is safe,” said Kahan, “while people who are more worried about economic inequality read the same information as implying that nanotechnology is likely to be dangerous.”
According to Kahan, this pattern is consistent with studies examining how people’s cultural values influence their perceptions of environmental and technological risks generally. “In sum, when they learned about a new technology, people formed reactions to it that matched their views of risks like climate change and nuclear waste disposal,” he said.
It’s hard to be sure whether the big science projects—which can take a significant percentage of the funding from the NIH, for example—are ultimately going to be as productive as typical investigator-initiated science projects. My own view is that what’s consistently propelled American scientific success has been individual, investigator-initiated science projects. I don’t imagine that will change too much. That’s not to say that the larger projects—for example, the genome-sequencing projects—are not worth it. Obviously, some of them are. Some people will be motivated by pursuing the X Prize to try things that they never would have done otherwise. A certain number of these catalytic events are really worth it. But I tend to favor the creative and individually masterminded, out-of-left-field kind of science, which often ends up being the most transformative. I’m confident that many of the truly original ideas come from people doing things that they are passionate about and then stumbling onto something completely unexpected. Certainly, the biological field is strewn with examples of great discoveries—absolutely revolutionary discoveries—that came out of seemingly trivial things. It’s not very often that big science leads you to true innovation in the sense of novel discoveries.
Creative ideas are not always solo strokes of genius, argues Ed Catmull, the computer-scientist president of Pixar and Disney Animation Studios, in the current issue of the Harvard Business Review. Frequently, he says, the best ideas emerge when talented people from different disciplines work together.
This week, Nature begins a series of six Essays that illustrate Catmull’s case. Each recalls a conference in which a creative outcome emerged from scientists pooling ideas, expertise and time with others — especially policy-makers, non-governmental organizations and the media. Each is written by someone who was there, usually an organizer or the meeting chair. Because the conferences were chosen for their societal consequences, we’ve called our series ‘Meetings that Changed the World’.
This week, François de Rose relives the drama of the December 1951 conference at the UNESCO headquarters in Paris that led to the creation of CERN, the European particle-physics laboratory based near Geneva (see page 174). De Rose, then France’s representative to the United Nations Atomic Energy Commission, chaired the meeting. He had got caught up in the process after becoming friends with Robert Oppenheimer, one of CERN’s earliest proponents. De Rose said in a separate interview with Nature that CERN was the result of the capacity of scientists such as Oppenheimer to propose grand ideas, and worry about obstacles later.
Although this approach does not always work, the next few weeks will show that it really has changed the world. In the ensuing half-century, CERN has revolutionized our understanding of the subatomic world; with the switching-on this week of the Large Hadron Collider (see page 156) it promises to scale new heights.
Armchair philosophers sometimes defend the purity of “Science” by distinguishing it from technology or applied science, a move resembling hip America’s affection for the idea of soccer, but not the game itself. Separate scientists from tools and applications, and what’s left? A feeble enterprise, a succession of conjectures.
When applied, science sometimes delivers but always—always—graces humanity with unexpected consequences. Nothing infuriates my literature and medicine students as much as Wendell Berry’s observation that “medicine is an exact science until applied,” and nothing they learn in their four years of medical school is more urgent and more true.
Barry Canton, a 28-year-old biological engineer at the Massachusetts Institute of Technology, has posted raw scientific data, his thesis proposal, and original research ideas on a website for all to see.
To young people primed for openness by the confessional existence they live online, that may not seem like a big deal. But in the world of science—where promotions, tenure, and fortune rest on publishing papers in prestigious journals, securing competitive grants, and patenting discoveries—it’s a brazen, potentially self-destructive move. To many scientists, leaving unfinished work and ideas in the open seems as reckless as leaving your debit card and password at a busy ATM.
Canton is part of a peaceful insurgency in science that is beginning to pry open an endeavor that still communicates its cutting-edge discoveries in much the same way it has since Ben Franklin was experimenting with lightning. Papers are published in research journals after being reviewed by specialists to ensure that the methods and conclusions are sound, a process that can take many months.
“We’re a generation who expects all information is a Google search away,” Canton said. “Not only is it a Google search away, but it’s also released immediately. As soon as it happens, the video is up on YouTube and on all the blogs. The old model feels kind of crazy when you’re used to this instant information.”
He once thought, he said, that the way to be a moral scientist was to avoid projects with bad applications. But he had changed his mind. The vital thing was to stay involved; to speak, write, testify, and make sure that research was turned not to evil, but to good. For more than 20 years he taught bioethics at Yale, a course he had started and which, by his last year, was one of the most popular in the college. His country forgot, but he did not, the mangrove ghosts.