Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/104655
Title: Extracting mode shapes for beams through a passing auxiliary mass
Authors: Zhang, Yao
Zhao, Hai-Sheng
Lie, Seng-Tjhen
Keywords: Mode Shapes
Auxiliary Mass
Engineering::Civil engineering
Issue Date: 2019
Source: Zhang, Y., Zhao, H.-S., & Lie, S.-T. (2019). Extracting mode shapes for beams through a passing auxiliary mass. Journal of Vibration and Acoustics, 141(5), 054501. doi:10.1115/1.4043542
Series/Report no.: Journal of Vibration and Acoustics
Abstract: This paper presents an approach to evaluating mode shapes for beams by using a passing auxiliary mass. The coupled system of an auxiliary mass passing over a beam is time-dependent, and the corresponding instantaneous frequencies (IFs) are equivalent to the mode shapes. Hence, the mode shapes can easily be reconstructed by estimating the IFs. A simple algorithm based on ridge detection is proposed to reconstruct the mode shapes. The method is effective if the beam is light or the lumped mass is heavy. It is convenient because it requires only an accelerometer mounted on the passing auxiliary mass rather than a series of sensors mounted on the structure itself, and it is more practical because installing an external exciter is usually difficult. A lab-scale experimental validation shows that the new technique is capable of identifying the first three mode shapes accurately.
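The ridge-detection idea in the abstract can be illustrated with a minimal sketch (not the authors' implementation): compute a spectrogram of the accelerometer signal recorded on the moving mass, track the peak-energy frequency (the ridge) inside a band around the mode of interest, and map time to position through the constant passing speed. The signal, speed, band limits, and the `ridge_if` helper below are hypothetical placeholders.

    # Minimal sketch of spectrogram ridge detection for IF estimation.
    # All numerical values (speed, band, sampling rate) are assumptions.
    import numpy as np
    from scipy.signal import spectrogram

    def ridge_if(accel, fs, f_band, nperseg=4096, noverlap=3584):
        """Estimate the instantaneous frequency as the spectrogram ridge inside f_band."""
        f, t, Sxx = spectrogram(accel, fs=fs, nperseg=nperseg, noverlap=noverlap)
        band = (f >= f_band[0]) & (f <= f_band[1])        # restrict to one mode
        ridge = f[band][np.argmax(Sxx[band, :], axis=0)]  # peak frequency per time slice
        return t, ridge

    if __name__ == "__main__":
        fs = 2000.0                                   # accelerometer sampling rate [Hz], assumed
        t = np.arange(0.0, 20.0, 1.0 / fs)            # 20 s passage, assumed
        # Toy signal whose IF dips at mid-span, mimicking the coupled-system frequency drop.
        f_inst = 8.0 - 1.5 * np.sin(np.pi * t / t[-1]) ** 2
        accel = np.sin(2.0 * np.pi * np.cumsum(f_inst) / fs)
        t_r, f_r = ridge_if(accel, fs, f_band=(5.0, 10.0))
        x = 0.05 * t_r                                # position = assumed constant speed * time
        # The variation of f_r along x reflects the mode shape at the mass location;
        # the paper's relation between the IF shift and the mode shape recovers it.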
URI: https://hdl.handle.net/10356/104655
http://hdl.handle.net/10220/49505
ISSN: 1048-9002
DOI: 10.1115/1.4043542
Schools: School of Civil and Environmental Engineering 
Rights: © 2019 American Society of Mechanical Engineers. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:CEE Journal Articles
