Thursday, July 28, 2011

One-way sound transmission system allows for sound control, energy-harvesting

Researchers at the California Institute of Technology (Caltech) have created the first tunable acoustic diode — a device that allows acoustic information to travel only in one direction, at controllable frequencies.

The researchers used experiments, simulations, and analytical predictions to demonstrate one-way transmission of sound in an audible frequency range for the first time.

This new mechanism brings the idea of true soundproofing closer to reality, the researchers said. Such a device would let someone in room A hear sound coming from room B, while sound produced in room A would not be heard in room B.

To obtain a sharp transition between transmitting and non-transmitting states, the team created a periodic system with a small defect that supports this kind of quick change from an “on” to an “off” transmission state. The system is very sensitive to small variations of operational conditions, like pressure and movement, making it useful in the development of ultrasensitive acoustic sensors to detect sound waves. The system can also operate at different frequencies of sound and is capable of downshifting, or reducing the frequency of the traveling signals, as needed.
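
As a rough illustration of how downshifting can give rise to one-way behaviour, consider a toy model. This is a conceptual sketch only, not the Caltech team's granular-crystal design, and the cutoff frequency and shift factor below are arbitrary illustrative values: a filter that transmits only low frequencies sits on one side, and a downshifting element sits on the other. A tone arriving from the downshifter's side is pulled into the pass band and goes through; the same tone arriving from the filter's side is rejected before it ever reaches the downshifter.

    # Toy model of one-way transmission via frequency downshifting.
    # Conceptual sketch only: the cutoff and shift factor are arbitrary
    # illustrative values, not parameters of the Caltech device.

    CUTOFF_HZ = 4000.0   # the periodic structure transmits only below this frequency
    SHIFT_FACTOR = 0.5   # the defect downshifts the travelling signal by this factor

    def band_filter(freq_hz):
        """Pass the signal only if it lies inside the transmission band."""
        return freq_hz if freq_hz < CUTOFF_HZ else None

    def downshift(freq_hz):
        """Reduce the frequency of the travelling signal."""
        return freq_hz * SHIFT_FACTOR

    def forward(freq_hz):
        """One direction: the signal is downshifted before it meets the filter."""
        return band_filter(downshift(freq_hz))

    def backward(freq_hz):
        """The other direction: the signal meets the filter first."""
        passed = band_filter(freq_hz)
        return downshift(passed) if passed is not None else None

    print("forward :", forward(6000.0))   # 3000.0 -> transmitted
    print("backward:", backward(6000.0))  # None   -> blocked

In this sketch a 6 kHz tone is shifted into the pass band and transmitted in one direction but blocked outright in the other, which is the essence of the diode behaviour described above.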

The system is based on a simple assembly of elastic spheres — granular crystals that transmit the sound vibrations — that could be easily used in multiple settings, can be tuned easily, and can potentially be scaled to operate within a wide range of frequencies. Its application could reach far beyond soundproofing, the researchers said.

Potential uses include architectural acoustics for sound control within buildings, biomedical ultrasound devices, advanced noise control, and thermal materials aimed at temperature control.

“We propose to use these effects to improve energy-harvesting technologies,” says Daraio, one of the researchers. “For example, we may be able to scavenge sound energy from undesired structural vibrations in machinery by controlling the flow of sound waves away from the machinery and into a transducer. The transducer would then convert the sound waves into electricity.” Daraio adds that the technology can also shift the undesired frequencies to a range that enables a more efficient conversion to electricity.

Tuesday, July 26, 2011

Researchers identify seventh and eighth bases of DNA

Researchers from the University of North Carolina (UNC) School of Medicine have identified the seventh and eighth bases of DNA.

For decades, scientists have known that DNA consists of four basic units — adenine, guanine, thymine, and cytosine. In recent history, scientists have expanded that list from four to six.

Much is known about the “fifth base,” 5-methylcytosine, which arises when a chemical tag or methyl group is tacked onto a cytosine. This methylation is associated with gene silencing, since it causes the DNA’s double helix to fold even tighter upon itself. Last year, the researchers found that Tet proteins can convert 5-methylcytosine (the fifth base) to 5-hydroxymethylcytosine (the sixth base) in the first of a four-step reaction leading back to cytosine.

However, the researchers had not been able to follow the reaction on to the seventh and eighth bases because their experimental assay was not sensitive enough. After redesigning the assay, they detected the two new bases, 5-formylcytosine (5fC) and 5-carboxylcytosine (5caC), which are versions of cytosine that have been modified by Tet proteins, molecular entities thought to play a role in DNA demethylation and stem cell reprogramming.
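
Written out in order, the modification cycle runs from plain cytosine through each of the numbered bases and back again. The short sketch below is just a bookkeeping aid based on the description in this article; the final return step to unmodified cytosine is not detailed here.

    # Ordered bookkeeping sketch of the cytosine modification cycle
    # described in the article. The annotations on the Tet steps follow
    # the text above; the route back to unmodified cytosine is not
    # detailed in the article.

    PATHWAY = [
        ("cytosine (C)",                   "one of the four classic bases"),
        ("5-methylcytosine (5mC)",         "5th base: methyl tag added to cytosine"),
        ("5-hydroxymethylcytosine (5hmC)", "6th base: first Tet conversion step"),
        ("5-formylcytosine (5fC)",         "7th base: further Tet modification"),
        ("5-carboxylcytosine (5caC)",      "8th base: further Tet modification"),
        ("cytosine (C)",                   "end of the cycle, back to the unmodified base"),
    ]

    for step, (base, note) in enumerate(PATHWAY):
        print(f"{step}: {base:32} {note}")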

The researchers then examined embryonic stem cells as well as mouse organs and found that both bases can be detected in genomic DNA.

Their findings could have important implications for stem cell research, since they could provide researchers with new tools to erase previous methylation patterns and thereby reprogram adult cells. The work could also inform cancer research by giving scientists the opportunity to reactivate tumor suppressor genes that had been silenced by DNA methylation.

Touchscreen keyboard morphs to fit your typing style

Typing on a touchscreen is not one of life's pleasures: the one-size-fits-all nature of most virtual keyboards is a hassle that puts many of us off using them. I've lost count of the number of times I've seen journalists put down an iPad, for instance, and pick up a laptop or netbook to do some serious note-taking or writing.

IBM, however, says it doesn't have to be that way. In a recently filed US patent application, three IBM engineers posit the notion of a virtual keyboard in which the position of the keys and the overall layout are set entirely by the user's finger anatomy. That way, they argue, people will be better able to type at speed, with all keys within comfortable reach, and so end up with fewer errors.

After an initial calibration stage, in which the keyboard asks users to undertake a series of exercises to set response time, anatomical algorithms get to work, sensing through the touchscreen the finger skin touch area, finger size, and finger position of the logged-in user.

As this information is gathered (IBM does not say over what period this learning takes place), the virtual key buttons are automatically resized, reshaped, and repositioned in response.
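
The filing does not spell out the adaptation algorithm, but a minimal sketch of the general idea might record where a user's taps actually land for each key and nudge that key's centre and size toward the observed pattern. The class, the update rule, and the learning rate below are illustrative assumptions, not details from IBM's application.

    # Minimal sketch of an adaptive virtual key: the key drifts toward the
    # user's observed touch positions and widens with their spread. All
    # names and numbers here are illustrative, not from the IBM patent.

    from statistics import mean, pstdev

    class AdaptiveKey:
        def __init__(self, label, x, y, width, height):
            self.label = label
            self.x, self.y = x, y                # key centre, in pixels
            self.width, self.height = width, height
            self.touches = []                    # (x, y) of taps attributed to this key

        def record_touch(self, tx, ty):
            self.touches.append((tx, ty))

        def adapt(self, rate=0.3, min_samples=10, min_size=40.0):
            """Blend the key centre toward the mean touch point and resize
            to cover roughly two standard deviations of the touch spread."""
            if len(self.touches) < min_samples:
                return                           # not enough data to adapt yet
            xs = [t[0] for t in self.touches]
            ys = [t[1] for t in self.touches]
            self.x += rate * (mean(xs) - self.x)
            self.y += rate * (mean(ys) - self.y)
            self.width = max(min_size, 4 * pstdev(xs))
            self.height = max(min_size, 4 * pstdev(ys))

    key = AdaptiveKey("F", x=300, y=200, width=60, height=60)
    for tx, ty in [(310, 195), (312, 198), (308, 196)] * 4:
        key.record_touch(tx, ty)
    key.adapt()
    print(key.label, round(key.x, 1), round(key.y, 1), round(key.width, 1))

Run per key and per user, something along these lines would produce the behaviour the filing describes: keys that gradually end up where, and as large as, that user's fingers actually need them.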

The patent shows a keyboard with some keys subtly higher than others, and some fatter than others. This "adapts the keyboard to the user's unique typing motion paths" governed by their different physical finger anatomies, says IBM, which suggests the idea could be used in both touchscreen and projected "surface computing" displays.

There does seem to be scope for such ideas. In a review of the Apple iPad, review website MacInTouch said: "A touch typist found it frustratingly glitchy versus a real keyboard, producing all sorts of ghost characters when the screen repeatedly misinterpreted his fingers' intentions."

Perhaps anatomical profiling is just what's needed.