Publications by authors named "Ilyas Potamitis"

10 Publications

Edge Computing for Vision-Based, Urban-Insects Traps in the Context of Smart Cities.

Sensors (Basel) 2022 Mar 4;22(5). Epub 2022 Mar 4.

Department of Electronic Engineering, Hellenic Mediterranean University, 73133 Chania, Greece.

Our aim is to promote the widespread use of electronic insect traps that report captured pests to a human-controlled agency. This work reports on edge computing as applied to camera-based insect traps. We present a low-cost device with high power autonomy and adequate picture quality that reports an internal image of the trap to a server and counts the insects it contains using quantized, embedded deep-learning models. The paper compares the performance of three edge devices, namely the ESP32, the Raspberry Pi 4 (RPi), and the Google Coral, running a deep-learning framework (TensorFlow Lite). All edge devices were able to process images and achieved counting accuracy exceeding 95%, but at different rates and power consumption. Our findings suggest that the ESP32 is the best choice in the context of this application, given our policy of favouring low-cost devices.
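The abstract mentions counting with quantized, embedded deep-learning models. As a toy illustration of what quantization means on such constrained devices (not the authors' TensorFlow Lite pipeline), the sketch below applies symmetric INT8 post-training quantization to a weight tensor in plain NumPy:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training quantization: map the largest weight
    magnitude to 127 and round everything else onto the INT8 grid."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for accuracy checks."""
    return q.astype(np.float32) * scale

w = np.array([-0.82, 0.11, 0.49, -0.05], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# reconstruction error is bounded by half a quantization step
print(np.max(np.abs(w - w_hat)) <= s / 2 + 1e-9)  # → True
```

Storing 8-bit integers instead of 32-bit floats cuts model size roughly fourfold and enables integer-only inference, which is what makes microcontrollers like the ESP32 viable for this task.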

Source:
DOI: http://dx.doi.org/10.3390/s22052006
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8914644
March 2022

Resolving the identification of weak-flying insects during flight: a coupling between rigorous data processing and biology.

Agric For Entomol 2021 Nov 2;23(4):489-505. Epub 2021 Jun 2.

Rothamsted Insect Survey, Rothamsted Research, West Common, Harpenden AL5 2JQ, U.K.

Bioacoustic methods play an increasingly important role in the detection of insects in a range of surveillance and monitoring programmes. Weak-flying insects evade detection because they do not yield sufficient audio information to capture wingbeat and harmonic frequencies. These inaudible insects often pose a significant threat to food security as pests of key agricultural crops worldwide. Automatic detection of such insects is crucial to the future of crop protection by providing critical information to assess the risk to a crop and the need for preventative measures. We describe an experimental set-up designed to derive audio recordings from a range of weak-flying aphids and beetles using an LED array. A rigorous data processing pipeline was developed to extract meaningful features, linked to morphological characteristics, from the audio and harmonic series for six aphid and two beetle species. An ensemble of over 50 bioacoustic parameters was used to achieve species discrimination with a success rate of 80%. The inclusion of the dominant and fundamental frequencies improved prediction between beetles and aphids because of large differences in wingbeat frequencies. At the species level, error rates were minimized when harmonic features were supplemented by features indicative of differences in species' flight energies.
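A minimal sketch of the kind of spectral processing such pipelines build on: estimating a wingbeat (fundamental) frequency as the dominant low-frequency peak of a recording. The synthetic 120 Hz signal with decaying harmonics and the 1 kHz search band are illustrative assumptions, not values from the paper:

```python
import numpy as np

def wingbeat_fundamental(x, fs):
    """Estimate the wingbeat (fundamental) frequency as the strongest
    spectral peak below 1 kHz, where insect wingbeats typically fall."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    band = freqs < 1000.0
    return freqs[band][np.argmax(spectrum[band])]

rng = np.random.default_rng(0)
fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
# synthetic weak-flyer: 120 Hz wingbeat, two decaying harmonics, noise
x = (np.sin(2 * np.pi * 120 * t) + 0.5 * np.sin(2 * np.pi * 240 * t)
     + 0.25 * np.sin(2 * np.pi * 360 * t)
     + 0.05 * rng.standard_normal(len(t)))
print(wingbeat_fundamental(x, fs))  # → 120.0
```

The paper's point is that for weak flyers the recorded signal is too faint for this naive peak-picking to work reliably, which is why harmonic-series features and flight-energy descriptors are needed on top.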

Source:
DOI: http://dx.doi.org/10.1111/afe.12453
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8596709
November 2021

In-Vivo Vibroacoustic Surveillance of Trees in the Context of the IoT.

Sensors (Basel) 2019 Mar 19;19(6). Epub 2019 Mar 19.

Department of Electrical and Electronics Engineering, University of West Attica, 12241 Athens, Greece.

This work introduces a device for the long-term, systematic monitoring of trees against borers. A widely applied way to detect wood-boring insects is to insert a piezoelectric probe with an uncoated waveguide into the tree trunk and listen for locomotion or feeding sounds through headphones. This approach has several shortcomings: (a) frequent manual inspection of trees is costly and impractical to scale to hundreds or thousands of trees; (b) the larvae could be present but inactive at inspection time; and (c) when the trees are in urban environments, the background noise can be significant and can mask the feeble sounds of wood-boring insects even with the use of specialized headphones. We introduce a remotely controlled device that records and wirelessly transmits, on a scheduled basis, short recordings of the internal vibrations of a tree to a server. The user can listen remotely, or process the recordings automatically, to infer whether the tree is infested with wood-boring insects that feed or move inside it. When integrated within the IoT framework, this device can scale up to automatically monitoring the trees of an entire city. In field trials, the proposed approach successfully detected infestations by cerambycid (Cerambycidae) and curculionid (Coleoptera: Curculionidae) wood-boring pests.
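As a toy illustration of automatically inferring infestation state from such recordings (not the paper's actual detector), a simple frame-energy test can flag feeding or locomotion bursts against a quiet background; the frame length, threshold, and synthetic burst are illustrative assumptions:

```python
import numpy as np

def active_frames(x, fs, frame_ms=20, thresh_db=-30.0):
    """Flag 20 ms frames whose RMS energy is within thresh_db of the
    loudest frame -- a crude proxy for feeding/locomotion bursts."""
    n = int(fs * frame_ms / 1000)
    frames = x[: len(x) // n * n].reshape(-1, n)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    rel_db = 20 * np.log10(rms / (rms.max() + 1e-12) + 1e-12)
    return rel_db > thresh_db

rng = np.random.default_rng(0)
fs = 8000
quiet = 0.001 * rng.standard_normal(fs)       # 1 s of faint background
burst = 0.5 * rng.standard_normal(fs // 10)   # 100 ms feeding burst
x = np.concatenate([quiet, burst, quiet])
flags = active_frames(x, fs)
print(flags.sum())  # → 5 (the 100 ms burst spans five 20 ms frames)
```

In the urban scenario the paper describes, a fixed threshold like this would fail against traffic noise, which motivates server-side processing of the transmitted recordings.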

Source:
DOI: http://dx.doi.org/10.3390/s19061366
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6471019
March 2019

Transfer Learning for Improved Audio-Based Human Activity Recognition.

Biosensors (Basel) 2018 Jun 25;8(3). Epub 2018 Jun 25.

Technological Educational Institute of Crete, E. Daskalaki, Perivolia, 74100, Rethymno, Greece.

Human activities are accompanied by characteristic sound events, the processing of which might provide valuable information for automated human activity recognition. This paper presents a novel approach addressing the case where one or more human activities are associated with limited audio data, resulting in a potentially highly imbalanced dataset. Data augmentation is based on transfer learning; more specifically, the proposed method: (a) identifies the classes which are statistically close to the ones associated with limited data; (b) learns a multiple-input, multiple-output transformation; and (c) transforms the data of the closest classes so that they can be used for modeling the ones associated with limited data. Furthermore, the proposed framework includes a feature set extracted from signal representations of diverse domains, i.e., temporal, spectral, and wavelet. Extensive experiments demonstrate the relevance of the proposed data augmentation approach under a variety of generative recognition schemes.
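Steps (a)-(c) can be sketched with toy data: pick the donor class whose feature mean is closest to the minority class, fit a linear multiple-input/multiple-output map by least squares, and transform the donor's examples. The class names, feature dimensions, and the purely linear map are illustrative assumptions; the paper's learned transformation may be more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy feature matrices: rows are recordings, columns are audio features
classes = {
    "door_slam":  rng.normal(-1.0, 1.0, (200, 8)),
    "footsteps":  rng.normal(0.5, 1.0, (200, 8)),
    "rare_event": rng.normal(0.6, 1.0, (10, 8)),  # class with limited data
}

# (a) donor class statistically closest to the minority class
target = classes["rare_event"]
donors = {k: v for k, v in classes.items() if k != "rare_event"}
closest = min(donors,
              key=lambda k: np.linalg.norm(donors[k].mean(0) - target.mean(0)))

# (b) a linear multiple-input/multiple-output map fitted by least squares
# on as many donor rows as there are minority examples
W, *_ = np.linalg.lstsq(classes[closest][: len(target)], target, rcond=None)

# (c) transform the whole donor set into synthetic minority-class examples
augmented = classes[closest] @ W
print(closest, augmented.shape)  # → footsteps (200, 8)
```

The payoff is that the minority class now has 200 synthetic training rows instead of 10, at the cost of assuming the donor and minority classes differ by a transformation the model family can capture.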

Source:
DOI: http://dx.doi.org/10.3390/bios8030060
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6163773
June 2018

Automated Surveillance of Fruit Flies.

Sensors (Basel) 2017 Jan 8;17(1). Epub 2017 Jan 8.

Department of Electronics Engineering, Piraeus University of Applied Sciences, Athens 12244, Greece.

Flies of the family Tephritidae (Diptera) cause costly annual crop losses worldwide. Monitoring traps are important components of the integrated pest management programs used against fruit flies. Here we report the modification of typical, low-cost plastic traps for fruit flies by adding the necessary optoelectronic sensors to monitor the entrance of the trap and thereby detect, time-stamp, GPS-tag, and identify the species of incoming insects from the optoacoustic spectrum analysis of their wingbeat. We propose that the incorporation of automated streaming of insect counts, environmental parameters and GPS coordinates into informative visualizations of collective behavior will finally enable better decision making across spatial and temporal scales, as well as administrative levels. The device presented is at a product level of maturity, as it resolves many of the open issues identified in a previously reported study.

Source:
DOI: http://dx.doi.org/10.3390/s17010110
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5298683
January 2017

Insect Biometrics: Optoacoustic Signal Processing and Its Applications to Remote Monitoring of McPhail Type Traps.

PLoS One 2015 Nov 6;10(11):e0140474. Epub 2015 Nov 6.

Department of Electronic and Computer Engineering, Technical University of Crete, Kounoupidiana, Chania, 73100, Greece.

Monitoring traps are important components of the integrated pest management programs applied against major fruit fly pests, including Bactrocera oleae (Gmelin) and Ceratitis capitata (Wiedemann), Diptera of the family Tephritidae, which cause crop losses calculated in billions of euros per year worldwide. Pests can be controlled with ground pesticide sprays, the efficiency of which depends on knowing the time, location and extent of infestations as early as possible. Trap inspection is currently carried out manually, using the McPhail trap, and mass spraying is decided on the basis of a decision protocol. We introduce the term 'insect biometrics' in the context of entomology as a measure of a characteristic of an insect (in our case, the spectrum of its wingbeat) that allows us to identify its species, and we build devices that help face old enemies with modern means. We turn a McPhail-type trap into an electronic one by installing an array of photoreceptors, coupled to an infrared emitter, guarding the entrance of the trap. The beating wings of insects flying into the trap intercept the light, and the light fluctuation is turned into a recording. Custom-made electronics were developed and are placed as an external add-on kit, without altering the internal space of the trap. Counts from the trap are transmitted over a mobile communication network. This trap introduces a new automated remote-monitoring method, different from audio- and vision-based systems. We evaluated our trap on large numbers of insects in the laboratory by enclosing the electronic trap in insectary cages. Our experiments assess its potential to deliver reliable data that can be used to reliably initiate the spraying process at large scales, and also to monitor the impact of spraying, as it eliminates the time lag between acquiring insect counts and delivering them to a central agency.

Source:
PLOS: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0140474
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4636391
June 2016

The electronic McPhail trap.

Sensors (Basel) 2014 Nov 25;14(12):22285-99. Epub 2014 Nov 25.

Department of Electrical and Computer Engineering, Technical University of Crete, Kounoupidiana, Chania 73100, Greece.

Certain insects affect cultivations in a detrimental way. A notable case is the olive fruit fly (Bactrocera oleae (Rossi)), which in Europe alone causes billions of euros in crop losses per year. Pests can be controlled with aerial and ground bait pesticide sprays, the efficiency of which depends on knowing the time and location of insect infestations as early as possible. The inspection of traps is currently carried out manually. Automatic monitoring traps can enhance the efficient monitoring of flying pests by identifying and counting targeted pests as they enter the trap. This work deals with the hardware setup of an insect trap with an embedded optoelectronic sensor that automatically records insects as they fly into the trap. The sensor responsible for detecting the insect is an array of phototransistors receiving light from an infrared LED. The wingbeat recording is based on the interruption of the emitted light due to partial occlusion by the insects' wings as they fly into the trap. We show that the recordings are of high quality, paving the way for automatic recognition and transmission of insect detections from the field to a smartphone. This work emphasizes the hardware implementation of the sensor and the detection/counting module, giving all the implementation details needed to construct it.

Source:
DOI: http://dx.doi.org/10.3390/s141222285
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4299014
November 2014

Automatic classification of a taxon-rich community recorded in the wild.

Authors:
Ilyas Potamitis

PLoS One 2014 May 14;9(5):e96936. Epub 2014 May 14.

Technological Educational Institute of Crete, Department of Music Technology and Acoustics, Crete, Greece.

There is a rich literature on the automatic species identification of specific target taxa across various vocalizing animals. Research is usually restricted to specific species, in most cases a single one. It is only very recently that the number of monitored species has started to increase for certain habitats involving birds. Automatic acoustic monitoring has not yet been proven generic enough to scale to taxa and habitats other than the ones described in the original research. Although attracting much attention, the acoustic monitoring procedure is neither well established nor universally adopted as a biodiversity monitoring tool. Recently, the multi-instance multi-label framework has been introduced for bird vocalizations to face the obstacle of simultaneously vocalizing birds of different species. We build on this framework to integrate novel, image-based heterogeneous features designed to capture different aspects of the spectrum. We applied our approach to a taxon-rich habitat that included 78 bird species, 8 insect species and 1 amphibian. This dataset constituted the Multi-label Bird Species Classification Challenge (NIPS 2013), where the proposed approach achieved an average accuracy of 91.25% on unseen data.

Source:
PLOS: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0096936
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4020809
December 2014

On automatic bioacoustic detection of pests: the cases of Rhynchophorus ferrugineus and Sitophilus oryzae.

J Econ Entomol 2009 Aug;102(4):1681-90

Department of Music Technology and Acoustics, Technological Educational Institute of Crete, Daskalaki-Perivolia, 74100 Rethymno, Greece.

The present work reports research efforts toward the development and evaluation of a unified framework for the automatic bioacoustic recognition of specific insect pests. Our approach is based on capturing and automatically recognizing the acoustic emissions resulting from typical behaviors, e.g., locomotion and feeding, of the target pests. After acquisition, the signals are amplified, filtered, parameterized, and classified by advanced machine learning methods on a portable computer. Specifically, we investigate an advanced signal parameterization scheme that relies on variable-size signal segmentation. The feature vector computed for each segment of the signal is composed of the dominant harmonic, which carries information about the periodicity of the signal, and the cepstral coefficients, which carry information about the relative distribution of energy among the different spectral sub-bands. This parameterization offers a reliable representation of both the acoustic emissions of the pests of interest and the interference from the environment. We illustrate the practical significance of our methodology in two specific cases: 1) a devastating pest of palm plantations, namely Rhynchophorus ferrugineus Olivier, and 2) a pest that attacks warehouse-stored rice (Oryza sativa L.), the rice weevil Sitophilus oryzae (L.) (both Coleoptera: Curculionidae, Dryophthorinae). These pests are known in many countries around the world and contribute to significant economic losses. The proposed approach achieved detection rates of 99.1% on real-field recordings of R. ferrugineus and 100% for S. oryzae.
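The two feature families named above, the dominant harmonic and the cepstral coefficients, can be sketched for a single signal frame as follows. The frame length, sampling rate, and test tone are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def dominant_harmonic(frame, fs):
    """Frequency of the strongest non-DC spectral peak (periodicity cue)."""
    spectrum = np.abs(np.fft.rfft(frame))
    return np.fft.rfftfreq(len(frame), 1.0 / fs)[np.argmax(spectrum[1:]) + 1]

def cepstral_coeffs(frame, n=12):
    """Real cepstrum: inverse FFT of the log magnitude spectrum. The
    low-order coefficients summarise how energy is spread across
    spectral sub-bands."""
    spectrum = np.abs(np.fft.fft(frame * np.hanning(len(frame))))
    return np.fft.ifft(np.log(spectrum + 1e-12)).real[:n]

fs = 8000
t = np.arange(512) / fs                    # one 64 ms frame
frame = np.sin(2 * np.pi * 500 * t) + 0.3 * np.sin(2 * np.pi * 1000 * t)
features = np.concatenate([[dominant_harmonic(frame, fs)],
                           cepstral_coeffs(frame)])
print(features[0], features.shape)  # → 500.0 (13,)
```

Concatenating a periodicity cue with envelope-shape coefficients, as here, is the standard way to make one feature vector capture both the pest's rhythmic emissions and broadband environmental interference.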

Source:
DOI: http://dx.doi.org/10.1603/029.102.0436
August 2009

Speech activity detection and enhancement of a moving speaker based on the wideband generalized likelihood ratio and microphone arrays.

J Acoust Soc Am 2004 Oct;116(4 Pt 1):2406-15

Wire Communications Laboratory, Electrical and Computer Engineering Department, University of Patras, Sofocleous-Adiparou 1, 265 00 Rion, Patras, Greece.

The subject of this work is a unifying treatment of estimating the direction of arrival (DOA), detecting speech activity and suppressing noise in the case of a moving speaker by using a linear microphone array. The approach is based on the generalized likelihood ratio test applied to the framework of far-field, wideband moving sources (W-GLRT). It is shown that, under certain distributional assumptions, the W-GLRT provides a framework for the evaluation of DOA measurements against spurious DOAs, probabilistic speech activity detection, as well as speech enhancement. As regards speech enhancement, we demonstrate the direct connection of the W-GLRT with enhancement based on subspace methods. In addition, through the concept of a directive a priori SNR, we demonstrate its indirect connection with minimum mean square error spectral amplitude (MMSE-SA) and log-spectral amplitude (MMSE-LSA) gain modification. The efficiency of the approach is illustrated on a moving speaker when either additive white Gaussian noise or babble noise is present in the acoustic field at very low SNRs.
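As a drastically simplified, single-channel illustration of likelihood-ratio activity detection (the paper's W-GLRT operates on wideband microphone-array data and also handles DOA estimation and enhancement): under zero-mean Gaussian models, replacing the unknown speech variance by its maximum-likelihood estimate reduces the test statistic to frame energy over the noise variance. All signal parameters below are illustrative assumptions:

```python
import numpy as np

def activity_stat(frame, noise_var):
    """Single-channel GLR statistic for 'speech present' vs 'noise only'
    with zero-mean Gaussian models: after plugging in the ML estimate of
    the speech variance, the test reduces to normalised frame energy."""
    return np.mean(frame ** 2) / noise_var

rng = np.random.default_rng(1)
noise_var = 1.0
noise_frame = rng.normal(0.0, 1.0, 256)
speech_frame = noise_frame + 2.0 * np.sin(2 * np.pi * 0.05 * np.arange(256))
threshold = 2.0
print(activity_stat(noise_frame, noise_var) > threshold,   # → False
      activity_stat(speech_frame, noise_var) > threshold)  # → True
```

The array version generalizes this by testing energy along the steered wideband subspace, which is what ties the detector to the DOA estimate and to subspace-based enhancement.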

Source:
DOI: http://dx.doi.org/10.1121/1.1781622
October 2004