|
|
|
|
LEADER |
00000nam a2200000Mi 4500 |
001 |
b3597611 |
003 |
CStclU |
005 |
20191219152940.0 |
006 |
m o d |
007 |
cr ||||||||||| |
008 |
191027s2019 xx o 000 0 eng d |
020 |
|
|
|a 1119544637
|q (electronic bk.)
|
020 |
|
|
|a 9781119544630
|q (electronic bk.)
|
035 |
|
|
|a (NhCcYBP)ebc5964413
|
040 |
|
|
|a NhCcYBP
|c NhCcYBP
|
100 |
1 |
|
|a Nandi, Asoke K.
|
700 |
1 |
|
|a Ahmed, Hosameldin.
|
245 |
1 |
0 |
|a Condition monitoring with vibration signals :
|b compressive sampling and learning algorithms for rotating machines.
|
260 |
|
|
|a Hoboken :
|b Wiley-Blackwell,
|c 2019.
|
300 |
|
|
|a 1 online resource.
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
505 |
0 |
0 |
|a Machine generated contents note:
|g pt. I
|t Introduction --
|g 1.
|t Introduction to Machine Condition Monitoring --
|g 1.1.
|t Background --
|g 1.2.
|t Maintenance Approaches for Rotating Machines Failures --
|g 1.2.1.
|t Corrective Maintenance --
|g 1.2.2.
|t Preventive Maintenance --
|g 1.2.2.1.
|t Time-Based Maintenance (TBM) --
|g 1.2.2.2.
|t Condition-Based Maintenance (CBM) --
|g 1.3.
|t Applications of MCM --
|g 1.3.1.
|t Wind Turbines --
|g 1.3.2.
|t Oil and Gas --
|g 1.3.3.
|t Aerospace and Defence Industry --
|g 1.3.4.
|t Automotive --
|g 1.3.5.
|t Marine Engines --
|g 1.3.6.
|t Locomotives --
|g 1.4.
|t Condition Monitoring Techniques --
|g 1.4.1.
|t Vibration Monitoring --
|g 1.4.2.
|t Acoustic Emission --
|g 1.4.3.
|t Fusion of Vibration and Acoustic --
|g 1.4.4.
|t Motor Current Monitoring --
|g 1.4.5.
|t Oil Analysis and Lubrication Monitoring --
|g 1.4.6.
|t Thermography --
|g 1.4.7.
|t Visual Inspection --
|g 1.4.8.
|t Performance Monitoring --
|g 1.4.9.
|t Trend Monitoring --
|g 1.5.
|t Topic Overview and Scope of the Book --
|g 1.6.
|t Summary --
|t References --
|g 2.
|t Principles of Rotating Machine Vibration Signals --
|g 2.1.
|t Introduction --
|g 2.2.
|t Machine Vibration Principles --
|g 2.3.
|t Sources of Rotating Machines Vibration Signals --
|g 2.3.1.
|t Rotor Mass Unbalance --
|g 2.3.2.
|t Misalignment --
|g 2.3.3.
|t Cracked Shafts --
|g 2.3.4.
|t Rolling Element Bearings --
|g 2.3.5.
|t Gears --
|g 2.4.
|t Types of Vibration Signals --
|g 2.4.1.
|t Stationary --
|g 2.4.2.
|t Nonstationary --
|g 2.5.
|t Vibration Signal Acquisition --
|g 2.5.1.
|t Displacement Transducers --
|g 2.5.2.
|t Velocity Transducers --
|g 2.5.3.
|t Accelerometers --
|g 2.6.
|t Advantages and Limitations of Vibration Signal Monitoring --
|g 2.7.
|t Summary --
|t References --
|g pt. II
|t Vibration Signal Analysis Techniques --
|g 3.
|t Time Domain Analysis --
|g 3.1.
|t Introduction --
|g 3.1.1.
|t Visual Inspection --
|g 3.1.2.
|t Features-Based Inspection --
|g 3.2.
|t Statistical Functions --
|g 3.2.1.
|t Peak Amplitude --
|g 3.2.2.
|t Mean Amplitude --
|g 3.2.3.
|t Root Mean Square Amplitude --
|g 3.2.4.
|t Peak-to-Peak Amplitude --
|g 3.2.5.
|t Crest Factor (CF) --
|g 3.2.6.
|t Variance and Standard Deviation --
|g 3.2.7.
|t Standard Error --
|g 3.2.8.
|t Zero Crossing --
|g 3.2.9.
|t Wavelength --
|g 3.2.10.
|t Willison Amplitude --
|g 3.2.11.
|t Slope Sign Change --
|g 3.2.12.
|t Impulse Factor --
|g 3.2.13.
|t Margin Factor --
|g 3.2.14.
|t Shape Factor --
|g 3.2.15.
|t Clearance Factor --
|g 3.2.16.
|t Skewness --
|g 3.2.17.
|t Kurtosis --
|g 3.2.18.
|t Higher-Order Cumulants (HOCs) --
|g 3.2.19.
|t Histograms --
|g 3.2.20.
|t Normal/Weibull Negative Log-Likelihood Value --
|g 3.2.21.
|t Entropy --
|g 3.3.
|t Time Synchronous Averaging --
|g 3.3.1.
|t TSA Signals --
|g 3.3.2.
|t Residual Signal (RES) --
|g 3.3.2.1.
|t NA4 --
|g 3.3.2.2.
|t NA4* --
|g 3.3.3.
|t Difference Signal (DIFS) --
|g 3.3.3.1.
|t FM4 --
|g 3.3.3.2.
|t M6A --
|g 3.3.3.3.
|t M8A --
|g 3.4.
|t Time Series Regressive Models --
|g 3.4.1.
|t AR Model --
|g 3.4.2.
|t MA Model --
|g 3.4.3.
|t ARMA Model --
|g 3.4.4.
|t ARIMA Model --
|g 3.5.
|t Filter-Based Methods --
|g 3.5.1.
|t Demodulation --
|g 3.5.2.
|t Prony Model --
|g 3.5.3.
|t Adaptive Noise Cancellation (ANC) --
|g 3.6.
|t Stochastic Parameter Techniques --
|g 3.7.
|t Blind Source Separation (BSS) --
|g 3.8.
|t Summary --
|t References --
|g 4.
|t Frequency Domain Analysis --
|g 4.1.
|t Introduction --
|g 4.2.
|t Fourier Analysis --
|g 4.2.1.
|t Fourier Series --
|g 4.2.2.
|t Discrete Fourier Transform --
|g 4.2.3.
|t Fast Fourier Transform (FFT) --
|g 4.3.
|t Envelope Analysis --
|g 4.4.
|t Frequency Spectrum Statistical Features --
|g 4.4.1.
|t Arithmetic Mean --
|g 4.4.2.
|t Geometric Mean --
|g 4.4.3.
|t Matched Filter RMS --
|g 4.4.4.
|t RMS of Spectral Difference --
|g 4.4.5.
|t Sum of Squares Spectral Difference --
|g 4.4.6.
|t High-Order Spectra Techniques --
|g 4.5.
|t Summary --
|t References --
|g 5.
|t Time-Frequency Domain Analysis --
|g 5.1.
|t Introduction --
|g 5.2.
|t Short-Time Fourier Transform (STFT) --
|g 5.3.
|t Wavelet Analysis --
|g 5.3.1.
|t Wavelet Transform (WT) --
|g 5.3.1.1.
|t Continuous Wavelet Transform (CWT) --
|g 5.3.1.2.
|t Discrete Wavelet Transform (DWT) --
|g 5.3.2.
|t Wavelet Packet Transform (WPT) --
|g 5.4.
|t Empirical Mode Decomposition (EMD) --
|g 5.5.
|t Hilbert-Huang Transform (HHT) --
|g 5.6.
|t Wigner-Ville Distribution --
|g 5.7.
|t Local Mean Decomposition (LMD) --
|g 5.8.
|t Kurtosis and Kurtograms --
|g 5.9.
|t Summary --
|t References --
|g pt. III
|t Rotating Machine Condition Monitoring Using Machine Learning --
|g 6.
|t Vibration-Based Condition Monitoring Using Machine Learning --
|g 6.1.
|t Introduction --
|g 6.2.
|t Overview of the Vibration-Based MCM Process --
|g 6.2.1.
|t Fault-Detection and Diagnosis Problem Framework --
|g 6.3.
|t Learning from Vibration Data --
|g 6.3.1.
|t Types of Learning --
|g 6.3.1.1.
|t Batch vs. Online Learning --
|g 6.3.1.2.
|t Instance-Based vs. Model-Based Learning --
|g 6.3.1.3.
|t Supervised Learning vs. Unsupervised Learning --
|g 6.3.1.4.
|t Semi-Supervised Learning --
|g 6.3.1.5.
|t Reinforcement Learning --
|g 6.3.1.6.
|t Transfer Learning --
|g 6.3.2.
|t Main Challenges of Learning from Vibration Data --
|g 6.3.2.1.
|t Curse of Dimensionality --
|g 6.3.2.2.
|t Irrelevant Features --
|g 6.3.2.3.
|t Environment and Operating Conditions of a Rotating Machine --
|g 6.3.3.
|t Preparing Vibration Data for Analysis --
|g 6.3.3.1.
|t Normalisation --
|g 6.3.3.2.
|t Dimensionality Reduction --
|g 6.4.
|t Summary --
|t References --
|g 7.
|t Linear Subspace Learning --
|g 7.1.
|t Introduction --
|g 7.2.
|t Principal Component Analysis (PCA) --
|g 7.2.1.
|t PCA Using Eigenvector Decomposition --
|g 7.2.2.
|t PCA Using SVD --
|g 7.2.3.
|t Application of PCA in Machine Fault Diagnosis --
|g 7.3.
|t Independent Component Analysis (ICA) --
|g 7.3.1.
|t Minimisation of Mutual Information --
|g 7.3.2.
|t Maximisation of the Likelihood --
|g 7.3.3.
|t Application of ICA in Machine Fault Diagnosis --
|g 7.4.
|t Linear Discriminant Analysis (LDA) --
|g 7.4.1.
|t Application of LDA in Machine Fault Diagnosis --
|g 7.5.
|t Canonical Correlation Analysis (CCA) --
|g 7.6.
|t Partial Least Squares (PLS) --
|g 7.7.
|t Summary --
|t References --
|g 8.
|t Nonlinear Subspace Learning --
|g 8.1.
|t Introduction --
|g 8.2.
|t Kernel Principal Component Analysis (KPCA) --
|g 8.2.1.
|t Application of KPCA in Machine Fault Diagnosis --
|g 8.3.
|t Isometric Feature Mapping (ISOMAP) --
|g 8.3.1.
|t Application of ISOMAP in Machine Fault Diagnosis --
|g 8.4.
|t Diffusion Maps (DMs) and Diffusion Distances --
|g 8.4.1.
|t Application of DMs in Machine Fault Diagnosis --
|g 8.5.
|t Laplacian Eigenmap (LE) --
|g 8.5.1.
|t Application of the LE in Machine Fault Diagnosis --
|g 8.6.
|t Local Linear Embedding (LLE) --
|g 8.6.1.
|t Application of LLE in Machine Fault Diagnosis --
|g 8.7.
|t Hessian-Based LLE --
|g 8.7.1.
|t Application of HLLE in Machine Fault Diagnosis --
|g 8.8.
|t Local Tangent Space Alignment Analysis (LTSA) --
|g 8.8.1.
|t Application of LTSA in Machine Fault Diagnosis --
|g 8.9.
|t Maximum Variance Unfolding (MVU) --
|g 8.9.1.
|t Application of MVU in Machine Fault Diagnosis --
|g 8.10.
|t Stochastic Proximity Embedding (SPE) --
|g 8.10.1.
|t Application of SPE in Machine Fault Diagnosis --
|g 8.11.
|t Summary --
|t References --
|g 9.
|t Feature Selection --
|g 9.1.
|t Introduction --
|g 9.2.
|t Filter Model-Based Feature Selection --
|g 9.2.1.
|t Fisher Score (FS) --
|g 9.2.2.
|t Laplacian Score (LS) --
|g 9.2.3.
|t Relief and Relief-F Algorithms --
|g 9.2.3.1.
|t Relief Algorithm --
|g 9.2.3.2.
|t Relief-F Algorithm --
|g 9.2.4.
|t Pearson Correlation Coefficient (PCC) --
|g 9.2.5.
|t Information Gain (IG) and Gain Ratio (GR) --
|g 9.2.6.
|t Mutual Information (MI) --
|g 9.2.7.
|t Chi-Squared (Chi-2) --
|g 9.2.8.
|t Wilcoxon Ranking --
|g 9.2.9.
|t Application of Feature Ranking in Machine Fault Diagnosis --
|g 9.3.
|t Wrapper Model-Based Feature Subset Selection --
|g 9.3.1.
|t Sequential Selection Algorithms --
|g 9.3.2.
|t Heuristic-Based Selection Algorithms --
|g 9.3.2.1.
|t Ant Colony Optimisation (ACO) --
|g 9.3.2.2.
|t Genetic Algorithms (GAs) and Genetic Programming --
|g 9.3.2.3.
|t Particle Swarm Optimisation (PSO) --
|g 9.3.3.
|t Application of Wrapper Model-Based Feature Subset Selection in Machine Fault Diagnosis --
|g 9.4.
|t Embedded Model-Based Feature Selection --
|g 9.5.
|t Summary --
|t References --
|g pt. IV
|t Classification Algorithms --
|g 10.
|t Decision Trees and Random Forests --
|g 10.1.
|t Introduction --
|g 10.2.
|t Decision Trees --
|g 10.2.1.
|t Univariate Splitting Criteria --
|g 10.2.1.1.
|t Gini Index --
|g 10.2.1.2.
|t Information Gain --
|g 10.2.1.3.
|t Distance Measure --
|g 10.2.1.4.
|t Orthogonal Criterion (ORT) --
|g 10.2.2.
|t Multivariate Splitting Criteria --
|g 10.2.3.
|t Tree-Pruning Methods --
|g 10.2.3.1.
|t Error-Complexity Pruning --
|g 10.2.3.2.
|t Minimum-Error Pruning --
|g 10.2.3.3.
|t Reduced-Error Pruning --
|g 10.2.3.4.
|t Critical-Value Pruning --
|g 10.2.3.5.
|t Pessimistic Pruning --
|g 10.2.3.6.
|t Minimum Description Length (MDL) Pruning --
|g 10.2.4.
|t Decision Tree Inducers --
|g 10.2.4.1.
|t CART --
|g 10.2.4.2.
|t ID3 --
|g 10.2.4.3.
|t C4.5 --
|g 10.2.4.4.
|t CHAID --
|g 10.3.
|t Decision Forests --
|g 10.4.
|t Application of Decision Trees/Forests in Machine Fault Diagnosis --
|g 10.5.
|t Summary --
|t References --
|g 11.
|t Probabilistic Classification Methods --
|g 11.1.
|t Introduction --
|g 11.2.
|t Hidden Markov Model --
|g 11.2.1.
|t Application of Hidden Markov Models in Machine Fault Diagnosis --
|g 11.3.
|t Logistic Regression Model --
|g 11.3.1.
|t Logistic Regression Regularisation --
|g 11.3.2.
|t Multinomial Logistic Regression Model (MLR) --
|g 11.3.3.
|t Application of Logistic Regression in Machine Fault Diagnosis --
|g 11.4.
|t Summary --
|t References --
|g 12.
|t Artificial Neural Networks (ANNs) --
|g 12.1.
|t Introduction --
|g 12.2.
|t Neural Network Basic Principles --
|g 12.2.1.
|t Multilayer Perceptron --
|g 12.2.2.
|t Radial Basis Function Network --
|g 12.2.3.
|t Kohonen Network --
|g 12.3.
|t Application of Artificial Neural Networks in Machine Fault Diagnosis --
|g 12.4.
|t Summary --
|t References --
|g 13.
|t Support Vector Machines (SVMs) --
|g 13.1.
|t Introduction --
|g 13.2.
|t Multiclass SVMs --
|g 13.3.
|t Selection of Kernel Parameters --
|g 13.4.
|t Application of SVMs in Machine Fault Diagnosis --
|g 13.5.
|t Summary --
|t References --
|g 14.
|t Deep Learning --
|g 14.1.
|t Introduction --
|g 14.2.
|t Autoencoders --
|
505 |
0 |
0 |
|a Contents note continued:
|g 14.3.
|t Convolutional Neural Networks (CNNs) --
|g 14.4.
|t Deep Belief Networks (DBNs) --
|g 14.5.
|t Recurrent Neural Networks (RNNs) --
|g 14.6.
|t Overview of Deep Learning in MCM --
|g 14.6.1.
|t Application of AE-based DNNs in Machine Fault Diagnosis --
|g 14.6.2.
|t Application of CNNs in Machine Fault Diagnosis --
|g 14.6.3.
|t Application of DBNs in Machine Fault Diagnosis --
|g 14.6.4.
|t Application of RNNs in Machine Fault Diagnosis --
|g 14.7.
|t Summary --
|t References --
|g 15.
|t Classification Algorithm Validation --
|g 15.1.
|t Introduction --
|g 15.2.
|t Hold-Out Technique --
|g 15.2.1.
|t Three-Way Data Split --
|g 15.3.
|t Random Subsampling --
|g 15.4.
|t K-Fold Cross-Validation --
|g 15.5.
|t Leave-One-Out Cross-Validation --
|g 15.6.
|t Bootstrapping --
|g 15.7.
|t Overall Classification Accuracy --
|g 15.8.
|t Confusion Matrix --
|g 15.9.
|t Recall and Precision --
|g 15.10.
|t ROC Graphs --
|g 15.11.
|t Summary --
|t References --
|g pt. V
|t New Fault Diagnosis Frameworks Designed for MCM --
|g 16.
|t Compressive Sampling and Subspace Learning (CS-SL) --
|g 16.1.
|t Introduction --
|g 16.2.
|t Compressive Sampling for Vibration-Based MCM --
|g 16.2.1.
|t Compressive Sampling Basics --
|g 16.2.2.
|t CS for Sparse Frequency Representation --
|g 16.2.3.
|t CS for Sparse Time-Frequency Representation --
|g 16.3.
|t Overview of CS in Machine Condition Monitoring --
|g 16.3.1.
|t Compressed Sensed Data Followed by Complete Data Construction --
|g 16.3.2.
|t Compressed Sensed Data Followed by Incomplete Data Construction --
|g 16.3.3.
|t Compressed Sensed Data as the Input of a Classifier --
|g 16.3.4.
|t Compressed Sensed Data Followed by Feature Learning --
|g 16.4.
|t Compressive Sampling and Feature Ranking (CS-FR) --
|g 16.4.1.
|t Implementations --
|g 16.4.1.1.
|t CS-LS --
|g 16.4.1.2.
|t CS-FS --
|g 16.4.1.3.
|t CS-Relief-F --
|g 16.4.1.4.
|t CS-PCC --
|g 16.4.1.5.
|t CS-Chi-2 --
|g 16.5.
|t CS and Linear Subspace Learning-Based Framework for Fault Diagnosis --
|g 16.5.1.
|t Implementations --
|g 16.5.1.1.
|t CS-PCA --
|g 16.5.1.2.
|t CS-LDA --
|g 16.5.1.3.
|t CS-CPDC --
|g 16.6.
|t CS and Nonlinear Subspace Learning-Based Framework for Fault Diagnosis --
|g 16.6.1.
|t Implementations --
|g 16.6.1.1.
|t CS-KPCA --
|g 16.6.1.2.
|t CS-KLDA --
|g 16.6.1.3.
|t CS-CMDS --
|g 16.6.1.4.
|t CS-SPE --
|g 16.7.
|t Applications --
|g 16.7.1.
|t Case Study 1 --
|g 16.7.1.1.
|t Combination of MMV-CS and Several Feature-Ranking Techniques --
|g 16.7.1.2.
|t Combination of MMV-CS and Several Linear and Nonlinear Subspace Learning Techniques --
|g 16.7.2.
|t Case Study 2 --
|g 16.7.2.1.
|t Combination of MMV-CS and Several Feature-Ranking Techniques --
|g 16.7.2.2.
|t Combination of MMV-CS and Several Linear and Nonlinear Subspace Learning Techniques --
|g 16.8.
|t Discussion --
|t References --
|g 17.
|t Compressive Sampling and Deep Neural Network (CS-DNN) --
|g 17.1.
|t Introduction --
|g 17.2.
|t Related Work --
|g 17.3.
|t CS-SAE-DNN --
|g 17.3.1.
|t Compressed Measurements Generation --
|g 17.3.2.
|t CS Model Testing Using the Flip Test --
|g 17.3.3.
|t DNN-Based Unsupervised Sparse Overcomplete Feature Learning --
|g 17.3.4.
|t Supervised Fine Tuning --
|g 17.4.
|t Applications --
|g 17.4.1.
|t Case Study 1 --
|g 17.4.2.
|t Case Study 2 --
|g 17.5.
|t Discussion --
|t References --
|g 18.
|t Conclusion --
|g 18.1.
|t Introduction --
|g 18.2.
|t Summary and Conclusion --
|t Appendix Machinery Vibration Data Resources and Analysis Algorithms --
|t References.
|
533 |
|
|
|a Electronic reproduction.
|b Ann Arbor, MI
|n Available via World Wide Web.
|
710 |
2 |
|
|a ProQuest (Firm)
|
856 |
4 |
0 |
|u https://ebookcentral.proquest.com/lib/santaclara/detail.action?docID=5964413
|z Connect to this title online (unlimited simultaneous users allowed; 325 uses per year)
|
907 |
|
|
|a .b35976111
|b 200401
|c 200224
|
998 |
|
|
|a uww
|b
|c m
|d z
|e l
|f eng
|g xx
|h 0
|
917 |
|
|
|a YBP DDA
|
919 |
|
|
|a .ulebk
|b 2017-02-14
|
999 |
f |
f |
|i 6b33e831-0d52-5e9d-9f52-939871617649
|s dbce943b-24cd-5028-a3ae-510d324cf6b3
|t 0
|