Research Interests

Intelligent Sensors Laboratory
Department of Electrical Engineering
Yale University
email: roman.kuc@yale.edu


The Intelligent Sensors Laboratory explores digital signal processing algorithms for extracting information from sensor data. System performance is enhanced by including the physical principles governing the sensor and by incorporating prior knowledge. Problems are approached using analysis, simulations, and processing of real data. Procedures motivated by biological sensing systems, such as echolocation by bats, dolphins, and blind humans, are explored and applied to task-oriented problems in robotics.
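
As a simple illustration of this kind of processing (a minimal sketch in Python, not code from the laboratory), the example below estimates the range to a target from a single sonar echo by matched filtering, assuming the emitted pulse is known and taking the speed of sound in air to be 343 m/s.

    import numpy as np

    def estimate_range(emitted, received, fs, c=343.0):
        """Estimate target range (m) from one echo by matched filtering.

        emitted  -- samples of the transmitted pulse
        received -- samples of the microphone signal containing the echo
        fs       -- sampling rate in Hz
        c        -- assumed speed of sound in air, m/s
        """
        # Cross-correlate the received signal with the emitted pulse;
        # the peak lag gives the round-trip travel time of the echo.
        corr = np.correlate(received, emitted, mode="full")
        lag = np.argmax(np.abs(corr)) - (len(emitted) - 1)
        return c * (lag / fs) / 2.0  # halve the round trip for one-way range

    # Hypothetical example: a 1 ms chirp echoed by a target 0.5 m away.
    fs = 96_000
    t = np.arange(int(1e-3 * fs)) / fs
    pulse = np.sin(2 * np.pi * (10_000 * t + 1e7 * t * t))  # 10-30 kHz up-chirp
    delay = int(round(2 * 0.5 / 343.0 * fs))                # round-trip delay in samples
    rx = np.zeros(delay + len(pulse))
    rx[delay:] += 0.3 * pulse                               # attenuated echo
    print(f"estimated range: {estimate_range(pulse, rx, fs):.3f} m")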

The laboratory is typically staffed by one professor and visiting researchers. Current and past support has been provided by grants from NSF and industry.

Recent projects include:

  • an audible sonar system that mimics echolocation by the blind.

    A view of the audible sonar head with speaker emitter and a pair of parabolic pinnae that direct sounds to microphones.

    A view of the audible sonar system that provides five degrees of freedom to probe targets.

  • a nonlinear sonar tracking system using wide-beam 40 kHz transducers (center transmitter with a horizontal pair for azimuth and a vertical pair for elevation correction). A sketch of azimuth estimation from such a receiver pair appears after this list.
    A view of the sensor (Robat 1).
  • an adaptive sonar tracking system using Polaroid 60 kHz transducers (the center transmitter is a wide-beam Polaroid 7000 series and the rotating ears are Polaroid instrumentation-grade transducers).
    A view of the sensor (Robat 2).
  • a sonar-guided camera using Polaroid 60 kHz transducers (two pairs of Polaroid instrumentation-grade transducers operating in nonlinear binaural mode).
    A view of the sensor (Sona Cam).
  • an adaptive sonar acoustic vision system for recognizing objects by their acoustic signatures.
    A close view of the biomimetic sonar, Rodolph.
    A full view of the sensor.
  • Yago – the Yale Go-cart that is instrumented with sonar and infrared
    sensors and controlled with a Pentium-class PC. Yago is being prepared
    for autonomous operation in outdoor environments.
    A front view of Yago.
    A view of the sensors.
  • a biologically motivated sonar system for tracking objects in two and three dimensions.
  • a computer model of the sensorimotor system of the big brown bat Eptesicus fuscus for prey tracking and capture with sonar.
  • an algorithm to image the source current distribution inside the body from magnetic measurements using SQUID detectors.
  • an intelligent wheelchair for use by the blind.
  • a hidden Markov model approach to determining the characteristics of potassium channels in the cell membrane.
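
For the tracking systems above, target direction can be obtained from the difference in arrival times of the same echo at a pair of receivers. The sketch below (Python, under a far-field assumption with an assumed receiver baseline; not the laboratory's implementation) estimates azimuth from a horizontal pair; a vertical pair gives elevation the same way.

    import numpy as np

    def azimuth_from_pair(echo_left, echo_right, fs, baseline, c=343.0):
        """Estimate target azimuth (radians) from a horizontal receiver pair.

        echo_left, echo_right -- echo signals at the two receivers
        fs       -- sampling rate in Hz
        baseline -- receiver separation in meters
        c        -- assumed speed of sound in air, m/s
        """
        # The peak of the cross-correlation gives the difference in arrival
        # time of the echo at the two receivers.
        corr = np.correlate(echo_left, echo_right, mode="full")
        lag = np.argmax(np.abs(corr)) - (len(echo_right) - 1)
        tdoa = lag / fs
        # Far-field geometry: path difference ~ baseline * sin(azimuth).
        return np.arcsin(np.clip(c * tdoa / baseline, -1.0, 1.0))

For example, with an assumed 5 cm baseline and 192 kHz sampling, a 10-sample lag corresponds to an azimuth of about 21 degrees.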


Recent publications:

R. Kuc & V. Kuc. Modeling human echolocation of near-range targets with an audible sonar. Journal of the Acoustical Society of America, 139(2), 2016, pp. 581-587.

R. Kuc & V. Kuc. Bat wing air pressures may deflect prey structures to provide echo cues for detecting prey in clutter. Journal of the Acoustical Society of America, 132(3), 2012, pp. 1776-1779.

R. Kuc. Echolocation with bat buzz emissions: Model and biomimetic sonar for elevation estimation. Journal of the Acoustical Society of America, 131(1), 2012, pp. 561-568.

R. Kuc. Bat noseleaf model: Echolocation function, design considerations, and experimental verification. Journal of the Acoustical Society of America, 129(5), 2011, pp. 3361-3366.

F.J. Alvarez, R. Kuc & T. Aguilera. Identifying fabrics with a variable emission airborne spiking sonar. IEEE Sensors Journal, 11(9), 2011, pp. 1905-1912.

R. Kuc. Morphology suggests noseleaf and pinnae cooperate to enhance bat echolocation. Journal of the Acoustical Society of America, 128(5), Nov 2010, pp. 3190-3199.