Thursday, August 20, 2015

Setting ground rules for nanotechnology research

In two new studies, researchers from across the country spearheaded by Duke Univ. faculty have begun to design the framework on which to build the emerging field of nanoinformatics.

Nanoinformatics is, as the name implies, the combination of nanoscale research and informatics. It attempts to determine which information is relevant to the field and then develop effective ways to collect, validate, store, share, analyze, model and apply that information—with the ultimate goal of helping scientists gain new insights into human health, the environment and more.

In the first paper, published in the Beilstein Journal of Nanotechnology, researchers open the conversation about how to standardize the way nanotechnology data are curated.

Because the field is young and yet extremely diverse, data are collected and reported in different ways in different studies, making it difficult to compare apples to apples. Silver nanoparticles in a Florida swamp could behave entirely differently if studied in the Amazon River. And even if two studies both look at their effects in humans, slight variations in body temperature or blood pH, or nanoparticles just a few nanometers larger, can give different results. For future studies to combine multiple datasets to explore more complex questions, researchers must agree on what they need to know when curating nanomaterial data.
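To make the point concrete, a curated nanomaterial record could bundle each measured result with the context needed to compare it against other studies. The following is a minimal sketch in Python; the field names and the required-context check are illustrative assumptions, not the curation standard the paper proposes.

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical curation record; field names are illustrative, not the
    # standard discussed in the Beilstein Journal of Nanotechnology paper.
    @dataclass
    class NanomaterialRecord:
        material: str                               # e.g. "Ag"
        core_diameter_nm: Optional[float] = None    # a few nm can change behavior
        coating: Optional[str] = None               # surface functionalization
        medium: Optional[str] = None                # "swamp water", "river water", "blood"
        temperature_k: Optional[float] = None       # body vs. ambient temperature
        ph: Optional[float] = None                  # blood or water pH
        endpoint: Optional[str] = None              # what was measured
        value: Optional[float] = None
        units: Optional[str] = None

    # Context fields that must be populated before two records can be pooled.
    REQUIRED_CONTEXT = ("core_diameter_nm", "medium", "temperature_k", "ph")

    def missing_context(record: NanomaterialRecord) -> list:
        """Return the required context fields this record leaves blank."""
        return [f for f in REQUIRED_CONTEXT if getattr(record, f) is None]

    r = NanomaterialRecord(material="Ag", core_diameter_nm=20.0, medium="river water")
    print(missing_context(r))   # ['temperature_k', 'ph']

A shared checklist of this kind is what lets a dataset collected in one environment be safely combined with, or deliberately kept apart from, one collected in another.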

Quantum computing advance locates neutral atoms

For any computer, being able to manipulate information is essential, but for quantum computing, singling out one data location without influencing any of the surrounding locations is difficult. Now, a team of Penn State Univ. physicists has developed a method for addressing individual neutral atoms without disturbing the surrounding atoms.

"There are a set of things that we have to have to do quantum computing," said David S. Weiss, professor of physics. "We are trying to step down that list and meet the various criteria. Addressability is one step."

Quantum computers are constructed and operate in completely different ways from the conventional digital computers used today. While conventional computers store information in bits, 1s and 0s, quantum computers store information in qubits. Because of a strange aspect of quantum mechanics called superposition, a qubit can be in both its 0 and 1 states at the same time. The methods of encoding information onto neutral atoms, ions or Josephson junctions (superconducting devices used in precision measurement) to create quantum computers are currently the subject of much research. Along with superposition, quantum computers will also take advantage of the quantum mechanical phenomenon of entanglement, which can create a mutually dependent group of qubits that must be considered as a whole rather than individually.
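To make superposition and entanglement slightly more concrete, a qubit's state can be written as a weighted sum of its 0 and 1 basis states and handled numerically as a two-component vector. The snippet below is a minimal sketch using NumPy arrays, independent of any particular physical platform mentioned above.

    import numpy as np

    # Computational basis states of a single qubit.
    zero = np.array([1.0, 0.0])
    one = np.array([0.0, 1.0])

    # Superposition: (|0> + |1>) / sqrt(2). A measurement yields 0 or 1,
    # each with probability |amplitude|^2 = 0.5.
    plus = (zero + one) / np.sqrt(2)
    print(np.abs(plus) ** 2)                  # [0.5 0.5]

    # Entanglement: the Bell state (|00> + |11>) / sqrt(2). The two qubits
    # form a mutually dependent pair that cannot be described one at a time.
    bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
    print(np.abs(bell) ** 2)                  # [0.5 0.  0.  0.5]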

Google Will Rebrand as Alphabet Inc.

The Google you once knew will no longer be Google; it’ll be Alphabet.

CEO Larry Page announced via Blogspot on Monday the creation of a new public holding company, Alphabet Inc., to replace Google Inc. in an attempt to make operations “cleaner and more accountable.”

“We’ve long believed that over time companies tend to get comfortable doing the same thing, just making incremental changes,” Page writes in his post. “But in the technology industry, where revolutionary ideas drive the next big growth areas, you need to be a bit uncomfortable to stay relevant.”

According to Page, Alphabet will be a collection of companies, one of which will still be named Google, albeit a more “slimmed down” version.

According to a filing with the U.S. Securities and Exchange Commission (SEC) from Google Inc., the Google business will include the search engine, ads, maps, apps, YouTube and Android, among other related technical infrastructure.

Paving the way for a faster quantum computer

Since its conception, quantum mechanics has defied our natural way of thinking and forced physicists to come to grips with peculiar ideas. Although they may be difficult to digest, quantum phenomena are real. What's more, in recent decades scientists have shown that these bizarre quantum effects can be used for many astonishingly powerful applications: from ultra-secure communication to hacking existing secure communications, and from simulating complex quantum systems to efficiently solving large systems of equations.

One of the most exciting and most difficult proposed quantum technologies is the quantum computer. Quantum logic gates are the basic building blocks of a quantum computer, but constructing enough of them to perform a useful computation is difficult. In the usual approach to quantum computing, quantum gates are applied in a specific order, one gate before another. But it was recently realized that quantum mechanics permits one to "superimpose quantum gates". If engineered correctly, this means that a set of quantum gates can act in all possible orders at the same time. Surprisingly, this effect can be used to reduce the total number of gates required for certain quantum computations.

All orders at once
A team led by Philip Walther recently realized that superimposing the order of quantum gates, an idea theoretically proposed by the group of Caslav Brukner, could be implemented in the laboratory. In a superposition of quantum gate orders, it is impossible, even in principle, to know whether one operation occurred before the other or the other way around. This means that two quantum logic gates A and B can be applied in both orders at the same time; in other words, gate A acts before B and, simultaneously, B acts before A. The physicists from Philip Walther's group designed an experiment in which the two quantum logic gates were applied to single photons in both orders.
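Ordering carries information at all because quantum gates generally do not commute: A then B is not the same operation as B then A. The sketch below illustrates the idea in NumPy, using a control qubit to select the gate order and placing that control in superposition, which is the standard abstract description of a "quantum switch"; it is an illustration only, not the photonic implementation used in the experiment.

    import numpy as np

    # Two single-qubit gates that do not commute, so their order matters.
    A = np.array([[0.0, 1.0], [1.0, 0.0]])                # Pauli-X
    B = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard
    print(np.allclose(A @ B, B @ A))                      # False

    # Quantum-switch-style construction: a control qubit selects the gate
    # order, and putting the control in superposition applies both orders
    # coherently. (Illustrative only; the experiment used single photons.)
    P0 = np.array([[1.0, 0.0], [0.0, 0.0]])               # |0><0| on the control
    P1 = np.array([[0.0, 0.0], [0.0, 1.0]])               # |1><1| on the control
    switch = np.kron(P0, A @ B) + np.kron(P1, B @ A)

    control = np.array([1.0, 1.0]) / np.sqrt(2)           # (|0> + |1>) / sqrt(2)
    target = np.array([1.0, 0.0])                         # target starts in |0>
    print(switch @ np.kron(control, target))              # both orders at once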

Combating the Life Science Data Avalanche

Big data has become a growing issue in science, as data sets are now so large and complex that traditional data processing applications are inadequate. This is especially true for the life science industry, where the growth in data hasn’t been matched by tools for analyzing and interpreting it, leading to what many call a “data avalanche.”

Life science researchers are generating more next-generation sequencing data, with more samples and deeper sequencing, and that data is growing in complexity as researchers move from targeted panels to whole-exome and whole-genome sequencing. However, the tools appearing on the market often fail to incorporate system-level interpretation, compounding the problem.

Enter bioinformatics, an interdisciplinary field—including computer science, statistics, mathematics and engineering—which develops methods and software tools for understanding biological data.

As a result, tools that take advantage of Cloud computing and storage are proliferating to handle the growing size of the data. Many vendors are developing their own Cloud-enabled software platforms capable of hosting a variety of analysis applications. And while life science researchers want to leverage the Cloud, they only want to do so for the additional functionality it brings, such as access to large data sets, common annotation sources, data sharing and scalable computing—they don’t want the hassle of uploading data unless there’s a real benefit.

Study calculates the speed of ice formation

Researchers at Princeton Univ. have, for the first time, directly calculated the rate at which water crystallizes into ice in a realistic computer model of water molecules. The simulations, which were carried out on supercomputers, provide insight into the mechanism by which water transitions from a liquid to a crystalline solid.

Understanding ice formation adds to our knowledge of how cold temperatures affect both living and non-living systems, including how living cells respond to cold and how ice forms in clouds at high altitudes. A more precise knowledge of the initial steps of freezing could eventually help improve weather forecasts and climate models, as well as inform the development of better materials for seeding clouds to increase rainfall.

The researchers looked at the process by which, as the temperature drops, water molecules begin to cling to each other to form a blob of solid ice within the surrounding liquid. These blobs tend to disappear quickly after their formation. Occasionally, a large enough blob, known as a critical nucleus, emerges and is stable enough to grow rather than to melt. The process of forming such a critical nucleus is known as nucleation.
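The surface-versus-bulk competition behind this behavior is captured by classical nucleation theory: the free-energy cost of a spherical nucleus has a surface term that grows as the radius squared and a bulk term that lowers it as the radius cubed, and the critical nucleus sits at the maximum of that curve. The sketch below evaluates the textbook expression with rough, illustrative parameter values; it is not the molecular simulation approach the Princeton team used.

    import numpy as np

    # Classical nucleation theory, textbook form (illustration only; parameter
    # values are rough estimates, not results from the study).
    gamma = 25e-3               # ice-water interfacial free energy, J/m^2
    rho = 3.1e28                # number density of molecules in ice, 1/m^3
    dmu = 1.5e-21               # free-energy gain per molecule on freezing, J
    kT = 1.380649e-23 * 230.0   # thermal energy at 230 K, J

    def delta_g(r):
        """Free-energy cost of a spherical ice nucleus of radius r (meters)."""
        return 4.0 * np.pi * r**2 * gamma - (4.0 / 3.0) * np.pi * r**3 * rho * dmu

    r_star = 2.0 * gamma / (rho * dmu)   # critical radius: growth beats melting
    print(f"critical radius ~ {r_star * 1e9:.1f} nm")
    print(f"nucleation barrier ~ {delta_g(r_star) / kT:.0f} kT")

Below the critical radius the surface term wins and the blob melts back; above it, adding molecules lowers the free energy and the nucleus grows.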

To study nucleation, the researchers used a computerized model of water that mimics the two atoms of hydrogen and one atom of oxygen found in real water. Through the computer simulations, the researchers calculated the average amount of time it takes for the first critical nucleus to form at a temperature of about 230 K or -43 C, which is representative of conditions in high-altitude clouds.

They found that, for a cubic meter of pure water, the amount of time it takes for a critical nucleus to form is about one-millionth of a second. The study, conducted by Amir Haji-Akbari, a postdoctoral research associate, and Pablo Debenedetti, a professor of chemical and biological engineering, was published online in the Proceedings of the National Academy of Sciences.
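For scale, the quoted figure of one critical nucleus per cubic meter every millionth of a second works out to roughly one nucleation event per cubic centimeter of pure water per second under these deeply supercooled conditions; the conversion is just unit bookkeeping.

    # Unit conversion of the figure quoted above (not an independent result):
    # one critical nucleus per cubic meter of pure water every ~1e-6 seconds.
    seconds_per_event_per_m3 = 1e-6
    events_per_second_per_m3 = 1.0 / seconds_per_event_per_m3    # 1e6
    events_per_second_per_cm3 = events_per_second_per_m3 / 1e6   # 1 m^3 = 1e6 cm^3
    print(events_per_second_per_cm3)    # ~1 event per cm^3 per second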

A Regulatory Helping Hand

Increasingly scrutinized, regulatory agencies are imposing stricter guidelines for approving new therapies. This is mainly due to a number of high-profile drugs that were commercialized and then withdrawn after findings of adverse effects in patients. For decades, agencies have struggled to find ways to bring better-qualified drugs to market while minimizing risks in clinical trials and reducing the amount of animal testing.

Regulation also requires a degree of data security and data provenance, topics of interest to the computer industry that have been addressed during the data science boom of the past two decades.

Advances in bioinformatics—including traceability, deep learning, predictive analytics and collaborative decision-making—have enabled agencies to bring recent drugs to market with a higher degree of confidence in successful therapy and reduced risk. “Bioinformatics platforms are arising that provide decision and process traceability across all variations and changes at both the product and country level,” Tim Moran, Director of Life Science Research Marketing at BIOVIA, told R&D Magazine. “Integrated bioinformatics platforms deliver connected and comprehensive regulatory and quality capabilities that accelerate therapeutic approval, production and patient adoption in a global landscape.”