Engineers combine light and sound to see underwater




Stanford University engineers have developed an aerial method for imaging underwater objects by combining light and sound to break through the seemingly impassable barrier at the interface between air and water.

The researchers envision that their optical-acoustic hybrid system will one day be used to conduct drone-based biological marine surveys from the air, carry out large-scale aerial searches for sunken ships and planes, and map ocean depths with a speed and level of detail comparable to what has been achieved for Earth’s landscapes. Their “Photoacoustic Airborne Sonar System” is detailed in a recent study published in the journal IEEE Access.

“Airborne and spaceborne radar and laser systems, or LIDARs, have been able to map Earth’s landscapes for decades. Radar signals can even penetrate cloud cover and forest canopy. However, seawater is far too absorbent for imaging into the water,” said study leader Amin Arbabian, associate professor of electrical engineering at Stanford’s School of Engineering. “Our goal is to develop a more robust system that can image even through murky water.”

Energy loss

Oceans cover about 70 percent of Earth’s surface, but only a small portion of their depths has been subjected to high-resolution imaging and mapping.

The main barrier has to do with physics: sound waves, for example, cannot pass from air into water, or vice versa, without losing most of their energy (more than 99.9 percent) through reflection at the boundary between the two media. A system that tries to see underwater using sound waves that travel from air into water and back into the air incurs this energy loss twice, resulting in a 99.9999 percent reduction in energy.
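As a rough sanity check of that arithmetic, here is a minimal sketch assuming roughly 0.1 percent of the acoustic energy is transmitted at each air-water crossing (a figure taken from the paragraph above, not from the study’s measurements):

```python
import math

# Roughly 0.1% of the acoustic energy survives each crossing of the
# air-water interface (the remaining ~99.9% is reflected).
transmitted_per_crossing = 0.001

# A conventional airborne sonar would cross the interface twice:
# air -> water on the way down, water -> air on the way back up.
round_trip = transmitted_per_crossing ** 2

print(f"Energy surviving the round trip: {round_trip:.1e}")                  # 1.0e-06
print(f"Total energy loss: {(1 - round_trip) * 100:.4f}%")                   # 99.9999%
print(f"Round-trip loss in decibels: {10 * math.log10(round_trip):.0f} dB")  # -60 dB
```

Generating the ultrasound directly in the water, as described below, removes one of those two crossings, so the signal pays the roughly 99.9 percent transmission penalty only once, on its way back out.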

Likewise, electromagnetic radiation – a generic term that includes light, microwave and radar signals – also loses energy as it passes from one physical medium to another, although the mechanism differs from that of sound. “Light also loses some energy from reflection, but most of the energy loss is due to absorption by the water,” explained study first author Aidan Fitzpatrick, a Stanford graduate student in electrical engineering. Incidentally, this absorption is also why sunlight cannot penetrate to the depths of the ocean and why your smartphone – which relies on cellular signals, a form of electromagnetic radiation – cannot receive calls under water.

The result of all of this is that the oceans cannot be mapped from the air and from space in the same way that the land can. To date, most underwater maps have been obtained by mounting sonar systems on ships that trawl a given region of interest. But this technique is slow, expensive, and inefficient for covering large areas.

An invisible puzzle

Enter the Photoacoustic Airborne Sonar System (PASS), which combines light and sound to break through the air-water interface. The idea came from another project that used microwaves to perform ‘contactless’ imaging and characterization of underground plant roots. Some of the PASS instruments were initially designed for this purpose in collaboration with the laboratory of Stanford electrical engineering professor Butrus Khuri-Yakub.

At its heart, PASS plays on the individual strengths of light and sound. “If we can use light in the air, where light travels well, and sound in water, where sound travels well, we can get the best of both worlds,” Fitzpatrick said.

To do this, the system first fires a laser from the air that is absorbed at the water’s surface. As the laser is absorbed, it generates ultrasonic waves that travel down through the water column and reflect off underwater objects before racing back toward the surface.

The returning sound waves still lose most of their energy when they cross the water’s surface, but by generating the sound waves underwater with lasers, the researchers keep that energy loss from happening twice.

“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” said Arbabian.

The reflected ultrasonic waves are recorded by instruments called transducers. Software then pieces the acoustic signals back together, like assembling an invisible jigsaw puzzle, to reconstruct a three-dimensional image of the submerged feature or object.
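The study does not spell out the reconstruction software, but the basic relationship any acoustic imaging relies on is standard sonar ranging: an echo’s arrival time, together with the known speed of sound in water, gives the distance to the reflector. A toy illustration (the 1,480 m/s sound speed and the helper function are assumptions for this sketch, not values from the paper):

```python
C_WATER = 1480.0  # approximate speed of sound in water, m/s

def reflector_depth_m(echo_delay_s: float) -> float:
    """Depth of a reflector from the round-trip travel time of its echo.

    The laser-generated pulse travels down to the object and the echo
    travels back up, so the one-way distance is half of speed * time.
    """
    return C_WATER * echo_delay_s / 2.0

# An echo recorded 40 ms after the pulse is generated implies a reflector
# roughly 30 meters below the surface.
print(f"{reflector_depth_m(0.040):.1f} m")  # ~29.6 m
```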

“Similar to how light refracts or ‘bends’ when it passes through water or any medium denser than air, ultrasound also refracts,” Arbabian explained. “Our image reconstruction algorithms correct this deflection that occurs when ultrasonic waves pass from water into air.”
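To make that refraction correction concrete, here is a minimal Snell’s-law sketch for acoustics (not the authors’ reconstruction algorithm; the sound speeds are textbook approximations): the transmitted angle depends on the ratio of sound speeds on the two sides of the interface.

```python
import math

C_WATER = 1480.0  # approximate speed of sound in water, m/s
C_AIR = 343.0     # approximate speed of sound in air, m/s

def refracted_angle_deg(incident_deg: float,
                        c_in: float = C_WATER,
                        c_out: float = C_AIR) -> float:
    """Snell's law for sound: sin(theta_in) / c_in == sin(theta_out) / c_out."""
    sin_out = math.sin(math.radians(incident_deg)) * c_out / c_in
    if abs(sin_out) > 1.0:
        # No transmitted wave (total internal reflection); this can only
        # happen when sound travels faster in the receiving medium.
        raise ValueError("No real refracted angle at this incidence")
    return math.degrees(math.asin(sin_out))

# An ultrasonic wave reaching the surface from below at 30 degrees off the
# vertical emerges into the air bent sharply toward the vertical, because
# sound travels much more slowly in air than in water.
print(f"{refracted_angle_deg(30.0):.1f} degrees")  # ~6.7 degrees
```

Without accounting for this bending, a reconstruction would place submerged objects away from their true positions.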

Ocean surveys with drones

Conventional sonar systems can penetrate to depths of hundreds to thousands of meters, and researchers expect their system will eventually be able to reach similar depths.

To date, PASS has been tested only in the laboratory, in a container the size of a large aquarium. “Current experiments use static water, but we are currently working to handle waves on the water surface,” said Fitzpatrick. “This is a challenging problem, but we believe it is feasible.”

The next step, the researchers say, will be to conduct tests in a larger environment and, ultimately, in an offshore environment.

“Our vision for this technology is to fly it aboard a helicopter or drone,” said Fitzpatrick. “We expect the system to be able to fly tens of meters above the water.”

Watch the video: https://youtu.be/2YyAnxQkeuk

