Jun 22

Reported by Quirin Schiermeier, in Nature News, 474, 265 (2011), 14 June 2011.

Regulators scramble to recover millions of euros awarded to fake research projects.

Stifling bureaucracy is often blamed for discouraging scientists and businesses from participating in the research programmes of the European Commission (EC). But the commission’s notoriously cumbersome procedures and rigid control mechanisms have apparently not prevented a criminal syndicate from conducting a brazen fraud that has siphoned off millions in EC grant funds.

Italian authorities and the European Anti-Fraud Office (OLAF) in Brussels, Belgium, have confirmed that they are prosecuting members of a large network accused of pocketing more than €50 million (US$72 million) in EC grants for fake research projects. In Milan, Italy, the Finance Police last month charged several individuals in relation to the fraud. In Brussels, meanwhile, the EC has terminated four collaborative projects in information technology, and excluded more than 30 grant-winners from participation in around 20 ongoing projects. Investigations are still under way in the United Kingdom, France, Greece, Austria, Sweden, Slovenia and Poland.

“We don’t have any records of [previous] fraud at such a scale,” says David Boublil, the commission’s spokesman for taxation, customs, anti-fraud and audit. While investigations continue, Italian prosecutors and OLAF will not disclose the names of the suspects, or the research projects with which they were involved.

The fraud has been conducted in a “highly sophisticated manner, resembling money laundering”, by means of a cross-border network of fictitious companies and subcontractors, says Pavel Bořkovec, a spokesman for OLAF. Several project coordinators stand accused of having claimed inflated costs, or expenses for non-existent research activities and services, he says.

“The projects were apparently organized with the sole intention to deceive the commission and its control mechanisms,” says Boublil. To make them seem legitimate, grant applications included the names of real scientists, established research institutes and existing companies, he says. But in most cases the alleged project partners were included without their knowing.

Insiders in Brussels say that rare cases of minor financial dishonesty, from inflated invoices to smaller cases of embezzlement, are regarded as unavoidable in large collaborative research projects. But the commission does extensive checks on project partners, including companies, which are meant to catch large-scale fraud. The success of the fraud suggests that those involved were unusually familiar with weaknesses in the EC’s procedures, and adept at forging legal documents.

Boublil insists that the commission has learned lessons from the case. All departments handling research grants — including the EC’s Information Society and Media Directorate General, which oversaw the terminated projects — are now trained to look out for the methods used by the network. Guidelines for evaluating projects and their partners are set to be updated. The EC has already recovered €10 million of the money, and will seek to recover the rest through the courts, Boublil says.

The commission is currently developing a multibillion-dollar ‘Common Strategic Framework’ which, from 2014, will combine its various funding streams into a single channel for all research and innovation funding. Concerned about the burden of Brussels bureaucracy, several thousand European scientists signed a petition this year calling for the framework to be “based on mutual trust and responsible partnering”. Some now fear that the fraud could hamper efforts to cut red tape.

“I’m worried that some will argue that what has happened proves that we need more rather than less control,” says Herbert Reul, chair of the European Parliament’s committee on industry, research and energy, which supports the simplification of the EC’s funding procedures. “I sincerely hope that this will not happen. Actually, it is a good sign that this worrying attempt at deceiving the commission has been discovered and will be punished.”

Jun 17

Reported by Lisa Grossman, in Wired Science, 14 June 2011.

Computers that run on chips made from tiny magnets may be as energy-efficient as physics permits.

[Schematic of how nanomagnetic computers store and work with information using the direction of magnetic north. Courtesy of Brian Lambson.]

According to new calculations, if nanomagnetic computers used any less energy, they’d break the second law of thermodynamics. Such computers are still semi-theoretical, but they could someday be used in the deep oceans or even deep space, where energy is at a premium.

If nothing else, nanomagnetic laptops wouldn’t overheat.

“They’re actually maximally efficient, in the sense that they use up only the energy that is theoretically required to carry out a computation,” said electrical engineer Brian Lambson of the University of California at Berkeley. The results will be published in Physical Review Letters.

Conventional computers process information by shuttling electrons around circuits. But though electrons have minuscule mass, it takes a surprising amount of energy to move them. Even the most advanced computers use far more energy than they theoretically need.

That theoretical energy limit was set by IBM physicist Rolf Landauer, who argued in 1961 that altering a single bit of information will always produce a tiny amount of heat. No matter how the computer is built, Landauer claimed, no change can occur without an accompanying transfer of energy. Most computers devour up to a million times more energy than this “Landauer limit” every time they do a calculation.
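At room temperature the Landauer limit works out to only a few zeptojoules per bit; a quick sketch of the arithmetic (assuming T = 300 K, a conventional choice not stated in the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated by erasing one bit: E = k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

room_temp = landauer_limit(300.0)   # ~2.87e-21 J per bit
conventional = 1e6 * room_temp      # the "up to a million times more" scale
print(f"Landauer limit at 300 K: {room_temp:.2e} J/bit")
print(f"At a million times the limit: {conventional:.2e} J/bit")
```

Multiplying by the "up to a million times" figure gives the rough scale of dissipation in a conventional gate.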

Nanomagnetic chips are made from material similar to refrigerator magnets, etched with rows of rectangles. Each rectangle measures about 100 nanometers on a side and has magnetic poles. Information is stored in how they point: One configuration is 1, the other is 0. Because the magnets are so small, they can be packed close enough for their magnetic fields to interact. Information passes without any physical changes to the chip.

“Magnetic systems are unique in that they have no moving parts,” Lambson said. “Moving parts are really what dissipate a lot of energy in physical systems, whether it’s moving electrons or physical material.”
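In the nanomagnetic-logic literature, the standard computing primitive built from such interacting magnets is the majority gate: an output magnet aligns with the net dipole field of its neighbours. Here is a toy model (an illustration, not the authors' physical simulation) that treats each magnet as a ±1 bit:

```python
# Toy nanomagnetic logic: each magnet is a bit (+1 = logic 1, -1 = logic 0).
# A majority gate's output magnet aligns with the majority of its three
# neighbouring inputs; pinning one input turns the gate into AND or OR.
def majority(a, b, c):
    return 1 if (a + b + c) > 0 else -1

def AND(a, b):
    return majority(a, b, -1)  # third input pinned to logic 0

def OR(a, b):
    return majority(a, b, 1)   # third input pinned to logic 1
```

Because the "wiring" is the magnets' stray fields, signals propagate with no charge transport at all, which is where the energy saving comes from.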

Nanomagnetic chip design is still in its infancy, far from optimally efficient. But to see how little energy the chips might consume, Lambson’s team estimated how magnetic fields would change during computation, then calculated the energy required to make those changes.

The results were close to Landauer’s limit. “We were surprised to see that they were almost exactly the same,” Lambson said.

Using magnets to build ultra-efficient computers is a powerful idea, said nanomagnetic logic pioneer Wolfgang Porod of the University of Notre Dame, who was not involved in the new work.

The Landauer limit, however, may not actually represent the limits of efficiency. “These arguments still are somewhat controversial,” Porod said. “The argument used to be more academic. But with devices getting smaller and smaller, these arguments are hitting closer to home.”

Reference: Brian Lambson, David Carlton and Jeffrey Bokor, “Exploring the Thermodynamic Limits of Computation in Integrated Systems: Magnetic Memory, Nanomagnetic Logic and the Landauer Limit.” Physical Review Letters, in press.

Jun 16

Reported by AlphaGalileo, Technical University of Denmark (DTU), 16 June 2011.

With 4000 times as much memory as an ordinary PC, the new supercomputer enables DTU researchers to rapidly identify new genes and proteins that can be used in future sustainable biotechnology industrial processes. The computer, an Altix UV 1000 model supplied by SGI, has been named Anakyklosis, the Greek word for recycling; the name reflects its importance to a biologically sustainable future.

“Systems Biology involves research that combines and integrates extremely large data sets, including genetic information. The new supercomputer has a huge so-called ‘shared memory’, which enables us to handle these data more quickly and flexibly. The computer’s capacity will considerably expand our ability to answer the basic biological questions we face, such as how to get a cell to produce something it was not originally made for,” explains Søren Brunak, director at the Center for Biological Sequence Analysis at DTU Systems Biology, where Anakyklosis has already been linked up to other supercomputers.

“The need for larger and faster computers has become very urgent due to the development of the metagenomics research area. This deals with mapping the entire genome content of bacterial communities, such as those found in the deep oceans, in wastewater or in our own gut. The resulting amount of data is several thousand times larger than the entire human genome,” says senior researcher Nikolaj Blom from the new Novo Nordisk Foundation Center for Biosustainability at DTU, which has funded the computer. Metagenomics systems biology will be one of the center’s six main research areas, and the supercomputer will be part of the center’s search for new enzymes for the biotech industry and the construction of biological cell factories. The goal of the center’s research is to produce tomorrow’s chemicals in living cells from inexpensive and sustainable raw materials and thereby reduce world dependence on oil.

“It’s mainly the memory of our computers, which currently limits how much data we can process at a time, and thus how quickly we make progress with our analysis,” says Thomas Sicheritz-Pontén, who will direct the research in metagenomics at the Novo Nordisk Foundation Center for Biosustainability. “Anakyklosis can hold the equivalent of 2500 human genomes in its working memory at once, so it opens up new opportunities for systems biology research.”
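As a rough sanity check on that claim, assuming about 3.2 billion bases per human genome and a naive one-byte-per-base encoding (both assumptions for illustration, not DTU's figures), 2500 genomes come to roughly 8 terabytes of working memory:

```python
# Back-of-envelope check: how much RAM does 2500 human genomes imply?
# Assumed figures, not from DTU:
BASES_PER_GENOME = 3.2e9   # approximate human genome size in base pairs
BYTES_PER_BASE = 1         # naive one-byte-per-base encoding
GENOMES_IN_MEMORY = 2500

total_bytes = GENOMES_IN_MEMORY * BASES_PER_GENOME * BYTES_PER_BASE
print(f"~{total_bytes / 1e12:.0f} TB of shared memory")  # ~8 TB
```

A compressed two-bits-per-base encoding would shrink this by a factor of four, so the estimate is an upper bound under these assumptions.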

Jun 03

Reported by Marcus Woo, June 2, 2011, in Caltech News.

In many ways, life is like a computer. An organism’s genome is the software that tells the cellular and molecular machinery—the hardware—what to do. But instead of electronic circuitry, life relies on biochemical circuitry—complex networks of reactions and pathways that enable organisms to function. Now, researchers at the California Institute of Technology (Caltech) have built the most complex biochemical circuit ever created from scratch, made with DNA-based devices in a test tube that are analogous to the electronic transistors on a computer chip.


A wiring diagram specifying a system of 74 DNA molecules that constitute the largest synthetic circuit of its type ever made. The circuit computes the square root of a number up to 15 and rounds down to the nearest integer (the discrete square root of a four-bit integer). [Credit: Caltech/Lulu Qian]

Engineering these circuits allows researchers to explore the principles of information processing in biological systems, and to design biochemical pathways with decision-making capabilities. Such circuits would give biochemists unprecedented control in designing chemical reactions for applications in biological and chemical engineering and industries. For example, in the future a synthetic biochemical circuit could be introduced into a clinical blood sample, detect the levels of a variety of molecules in the sample, and integrate that information into a diagnosis of the pathology.

“We’re trying to borrow the ideas that have had huge success in the electronic world, such as abstract representations of computing operations, programming languages, and compilers, and apply them to the biomolecular world,” says Lulu Qian, a senior postdoctoral scholar in bioengineering at Caltech and lead author on a paper published in the June 3 issue of the journal Science.

Along with Erik Winfree, Caltech professor of computer science, computation and neural systems, and bioengineering, Qian used a new kind of DNA-based component to build the largest artificial biochemical circuit ever made. Previous lab-made biochemical circuits were limited because they worked less reliably and predictably when scaled to larger sizes, Qian explains. The likely reason behind this limitation is that such circuits need various molecular structures to implement different functions, making large systems more complicated and difficult to debug. The researchers’ new approach, however, involves components that are simple, standardized, reliable, and scalable, meaning that even bigger and more complex circuits can be made and still work reliably.

“You can imagine that in the computer industry, you want to make better and better computers,” Qian says. “This is our effort to do the same. We want to make better and better biochemical circuits that can do more sophisticated tasks, driving molecular devices to act on their environment.”

To build their circuits, the researchers used pieces of DNA to make so-called logic gates—devices that produce on-off output signals in response to on-off input signals. Logic gates are the building blocks of the digital logic circuits that allow a computer to perform the right actions at the right time. In a conventional computer, logic gates are made with electronic transistors, which are wired together to form circuits on a silicon chip. Biochemical circuits, however, consist of molecules floating in a test tube of salt water. Instead of depending on electrons flowing in and out of transistors, DNA-based logic gates receive and produce molecules as signals. The molecular signals travel from one specific gate to another, connecting the circuit as if they were wires.

Winfree and his colleagues first built such a biochemical circuit in 2006. In this work, DNA signal molecules connected several DNA logic gates to each other, forming what’s called a multilayered circuit. But this earlier circuit consisted of only 12 different DNA molecules, and the circuit slowed down by a few orders of magnitude when expanded from a single logic gate to a five-layered circuit. In their new design, Qian and Winfree have engineered logic gates that are simpler and more reliable, allowing them to make circuits at least five times larger.

Their new logic gates are made from pieces of either short, single-stranded DNA or partially double-stranded DNA in which single strands stick out like tails from the DNA’s double helix. The single-stranded DNA molecules act as input and output signals that interact with the partially double-stranded ones.

“The molecules are just floating around in solution, bumping into each other from time to time,” Winfree explains. “Occasionally, an incoming strand with the right DNA sequence will zip itself up to one strand while simultaneously unzipping another, releasing it into solution and allowing it to react with yet another strand.” Because the researchers can encode whatever DNA sequence they want, they have full control over this process. “You have this programmable interaction,” he says.
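The zip-up-and-release mechanism Winfree describes (toehold-mediated strand displacement) can be caricatured in a few lines: a gate holds an output strand and releases it only when an incoming strand complements the gate's recognition sequence. The sequences and names below are illustrative, not taken from the paper:

```python
def complement(seq):
    """Watson-Crick complement of a DNA sequence, read in reverse."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(seq))

def displace(gate_template, bound_output, incoming):
    """Release the bound output strand only if the incoming strand fully
    complements the gate's template (toehold plus migration domain)."""
    if incoming == complement(gate_template):
        return bound_output  # output strand released into solution
    return None              # sequence mismatch: nothing happens

# Illustrative gate, not a sequence from the paper:
gate = "ATTGCC"
released = displace(gate, "SIGNAL_X", complement(gate))
```

Because the released strand can itself serve as the input to another gate, cascading these reactions wires gates into circuits.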

Qian and Winfree made several circuits with their approach, but the largest—containing 74 different DNA molecules—can compute the square root of any number up to 15 (technically speaking, any four-bit binary number) and round down the answer to the nearest integer. The researchers then monitor the concentrations of output molecules during the calculations to determine the answer. The calculation takes about 10 hours, so it won’t replace your laptop anytime soon. But the purpose of these circuits isn’t to compete with electronics; it’s to give scientists logical control over biochemical processes.
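The computation itself is easy to state in conventional logic. The sketch below computes the same discrete square root of a four-bit input using only AND, OR and NOT gates; the gate arrangement is illustrative and is not the paper's 74-molecule seesaw-gate circuit:

```python
import math

def NOT(a): return 1 - a
def AND(*bits): return int(all(bits))
def OR(*bits): return int(any(bits))

def discrete_sqrt(b3, b2, b1, b0):
    """Two output bits (s1, s0) encoding floor(sqrt(n)) of the
    four-bit input n = 8*b3 + 4*b2 + 2*b1 + b0."""
    s1 = OR(b3, b2)  # result is >= 2 exactly when n >= 4
    s0 = OR(AND(b3, OR(b2, b1, b0)),                 # n in 9..15 -> 3
            AND(NOT(b3), NOT(b2), OR(b1, b0)))       # n in 1..3  -> 1
    return s1, s0

# Verify against all 16 possible inputs:
for n in range(16):
    b3, b2, b1, b0 = [(n >> i) & 1 for i in (3, 2, 1, 0)]
    s1, s0 = discrete_sqrt(b3, b2, b1, b0)
    assert 2 * s1 + s0 == math.isqrt(n)
```

A silicon chip runs this in nanoseconds; the point of the DNA version is that its inputs and outputs are molecules, so the answer can directly drive further chemistry.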

Their circuits have several novel features, Qian says. Because reactions are never perfect—the molecules don’t always bind properly, for instance—there’s inherent noise in the system. This means the molecular signals are never entirely on or off, as would be the case for ideal binary logic. But the new logic gates are able to handle this noise by suppressing and amplifying signals—for example, boosting a signal that’s at 80 percent, or inhibiting one that’s at 10 percent, resulting in signals that are either close to 100 percent present or nonexistent.
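That restoration step can be modelled as a nonlinear map that pushes analog levels toward 0 or 1; a toy version follows, in which the sigmoid shape, gain and 0.5 threshold are illustrative choices, not the paper's gate kinetics:

```python
import math

def restore(level, gain=10.0, threshold=0.5):
    """Signal restoration: suppress sub-threshold levels toward 0 and
    amplify super-threshold levels toward 1."""
    return 1.0 / (1.0 + math.exp(-gain * (level - threshold)))

restore(0.8)  # boosted close to full strength
restore(0.1)  # suppressed close to zero
```

Applying the map at every layer keeps noise from accumulating as circuits grow deeper, which is what makes the approach scalable.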

All the logic gates have identical structures with different sequences. As a result, they can be standardized, so that the same types of components can be wired together to make any circuit you want. What’s more, Qian says, you don’t have to know anything about the underlying biochemistry to make one. If you want a circuit that, say, automatically diagnoses a disease, you just submit an abstract representation of the logic functions in your design to a compiler that the researchers provide online, which will then translate the design into the DNA components needed to build the circuit. In the future, an outside manufacturer could then make those parts and give you the circuit, ready to go.

The circuit components are also tunable. By adjusting the concentrations of the types of DNA, the researchers can change the functions of the logic gates. The circuits are versatile, featuring plug-and-play components that can be easily reconfigured to rewire the circuit. The simplicity of the components also allows for more efficient techniques for synthesizing them in parallel.

“Like Moore’s Law for silicon electronics, which says that computers are growing exponentially smaller and more powerful every year, molecular systems developed with nanotechnology have been doubling in size roughly every three years,” Winfree says. Qian adds, “The dream is that synthetic biochemical circuits will one day achieve complexities comparable to life itself.”

Reference: Lulu Qian and Erik Winfree, “Scaling up digital circuit computation with DNA strand displacement cascades”, Science, vol. 332, no. 6034, pp. 1196-1201, 3 June 2011.

Jun 02

Reported by Louise Lerner, June 2, 2011, in PhysOrg.

A team of researchers from the University of Chicago and the U.S. Department of Energy’s (DOE) Argonne National Laboratory has demonstrated a method that could produce cheaper semiconductor layers for solar cells.

Inorganic surface ligands enable facile electron transport between quantum dots and open novel opportunities for using nanostructures in solar cells.

The inorganic nanocrystal arrays, created by spraying a new type of colloidal “ink”, have excellent electron mobility and could be a step towards addressing fundamental problems with current solar technology.

“With today’s solar cells, if you want to get significant amounts of electricity, you’d have to build huge installations over many square miles,” said team leader Dmitri Talapin, who holds a joint appointment with Argonne and the university. But because current solar cells are based on silicon, which is costly and environmentally unfriendly to manufacture, they aren’t cost-effective over large areas. The challenge for scientists is to find a way to manufacture large numbers of solar cells that are both efficient and cheap.

One possibility to make solar cells more economically would be to “print” them, similar to how newspapers are printed. “You’d use a kind of ‘ink,’ stamped on using a roll technology with a flexible substrate,” Talapin said.

Solar cells have several layers of different materials stacked on top of each other. The team focused on the most important layer, which captures sunlight and converts it into electricity. This layer, made of a semiconductor, must be able to transform light into negative and positive electrical charges but also easily release them to move through the material to generate electrical current.

Arrays of quantum dots allow fabrication of solar cells by printing and other inexpensive techniques.

Many methods to grow the semiconductors need high temperatures, but a cheaper approach would be to make them in solution. This, however, requires a precursor that is soluble. The team developed such a precursor using semiconductor nanocrystals. Small grains of semiconductors, suspended in a liquid, are “glued” together with new molecules called “molecular metal chalcogenide complexes.” The process heats the material to about 200 degrees Celsius, much lower than the temperatures required for manufacturing silicon solar cells. The result is a layer of material with good semiconducting properties.

“The electron mobility for this material is an order of magnitude higher than previously reported for any solution-based method,” Talapin said.

The team used intense X-rays from the DOE Office of Science’s Advanced Photon Source at Argonne to watch as the semiconductor film was created.

“We believe that we could make solar cells that are very competitive with these nanoparticles,” Talapin said.

Talapin said the success played on the complementary partnership between the University of Chicago and Argonne’s Center for Nanoscale Materials. “At the university we have great students and postdocs who can do a lot of the theoretical chemistry, which requires a lot of manpower,” Talapin said, “but Argonne is a fantastic place to do research that requires sophisticated instrumentation and infrastructure.”

The paper, “Band-like transport, high electron mobility and high photoconductivity in all-inorganic nanocrystal arrays”, was published in Nature Nanotechnology.

Provided by Argonne National Laboratory.
