Minghua Xie's research while affiliated with Changsha Medical University and other places

Publications (11)

Article
Full-text available
This paper presents a piezoelectric (PE) energy harvesting circuit based on the DSSH (double synchronized switch harvesting) principle. The circuit consists of a rectifier and a DC–DC circuit, which together achieve double synchronized switch operation for the PE transducer in each vibration half-cycle. One of the main challenges of the DSSH scheme was pr...
Article
Full-text available
In this paper, the problem of robust tracking for two-arm condenser cleaning crawler-type mobile manipulators (CCCMM) with delayed angle-velocity uncertainties is investigated for the first time. The two-arm CCCMM consists of a crawler-type mobile platform and two-arm industrial manipulators. The uncertain...
Article
Full-text available
Support vector regression (SVR) is a powerful kernel-based method which has been successfully applied to regression problems. In feature-weighted SVR algorithms, each feature's contribution to the model output is taken into account. However, the performance of the model depends on the feature weights and the time consumed in training. In th...
Article
Full-text available
This paper presents a piezoelectric (PE) energy harvesting circuit, which integrates a Synchronized Switch Harvesting on Inductor (SSHI) circuit and a diode bridge rectifier. A typical SSHI circuit cannot transfer the power from a PE cantilever into the load when the rectified voltage is higher than a certain voltage. The proposed circuit addresses...
Article
Full-text available
A two-stage scheduling robust predictive control (RPC) algorithm, which is based on the time-varying coefficient information of the state-dependent ARX (SD-ARX) model, is designed for the output tracking control of a class of nonlinear systems. First, by using the parameter variation range information of the SD-ARX, a strategy for constructing the...
Article
Full-text available
The inverse system method is effective for controlling permanent magnet synchronous motors (PMSM). This article proposes a novel support vector regression (SVR) inverse system method to realize decoupling control of a PMSM. First, the kernel space feature of the inverse model is optimized by using the prior information provided by the mathematical model...
Article
Full-text available
Support Vector Regression (SVR), which converts the original low-dimensional problem to a high-dimensional kernel-space linear problem by introducing kernel functions, has been successfully applied in system modeling. In the classical SVR algorithm, the values of the features are taken into account, while their contribution to the model ou...
Article
Support vector regression (SVR) has unique advantages in system modeling because of the structural risk minimization principle. However, its generalization ability depends on the excitation signals and kernel parameters. An SVR modeling method excited by sinusoidal signals and based on kernel space features is proposed to improve the generalizatio...

Citations

... Support Vector Regression (SVR) is a generalization of Support Vector Machine (SVM) that incorporates regression functions into SVM to solve regression problems [39,40]. As a supervised machine learning algorithm, SVR has a high capability in regression modeling [41,42]. SVR is a kernel-based technique in which the kernel function projects the input data into higher-dimensional feature space to find the hyperplane with the lowest error margin and the best fit to the regression line [43,44]. ...
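The kernel-based regression described in this snippet can be sketched with a standard SVR implementation. The example below is illustrative only and assumes scikit-learn, which is not a tool named by the cited works; the data and hyperparameters are likewise invented for demonstration:

```python
# Illustrative sketch of kernel-based SVR, assuming scikit-learn.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# The RBF kernel implicitly projects inputs into a high-dimensional
# feature space, where a linear epsilon-insensitive fit is sought.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05)
model.fit(X, y)
pred = model.predict(X)
```

With enough support vectors, the RBF-SVR recovers the underlying sine trend despite the additive noise, which is the "best fit with the lowest error margin" behavior the snippet refers to.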
... The maximum harvested power of the MIS-SSHI circuit can reach 3.7 times that of SEH. Wu et al. [27] also presented a multi-input interface circuit using a single inductor, which integrates a series SSHI circuit and a voltage doubler. It can harvest 3 times as much power from two PZTs as a SEH circuit. ...
... However, its efficiency is lower due to the discarded capacitor charge. In contrast, the synchronized switch harvesting on inductor (SSHI) techniques, such as the P-SSHI [4], S-SSHI [5], Hybrid SSHI [6], or Triple bias-flip SSHI [7], adopt an LC resonator with an external inductor. The LC resonator in the SSHI scheme changes the polarity of the capacitor charge, flipping the capacitor voltage from positive to negative [8][9][10][11][12][13][14]. ...
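The voltage flip performed by the LC resonator can be illustrated numerically. The sketch below assumes the standard lossy-SSHI textbook result (not a value from the cited papers) that switching losses shrink the flipped voltage by gamma = exp(-pi / (2·Q)), where Q is the quality factor of the switching loop:

```python
import math

# Sketch of the SSHI voltage flip: when the switch closes, the
# transducer capacitance resonates with the external inductor for
# half an LC period, inverting the capacitor voltage. Losses reduce
# the magnitude by gamma = exp(-pi / (2 * Q)) (assumed standard
# result, not taken from the cited works).
def flipped_voltage(v_before, q_factor):
    gamma = math.exp(-math.pi / (2.0 * q_factor))
    return -gamma * v_before

v_after = flipped_voltage(10.0, 5.0)  # 10 V flips to about -7.3 V
```

A higher Q means less loss per flip, so gamma approaches 1 and the harvested power approaches the ideal lossless case.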
... In the decoupling strategy of the NNIS, the key is the design and construction of the neural network, but this has not been discussed much in the literature. A typical error back-propagation feed-forward neural network (BPNN) is selected in many papers to identify the inverse system (Bu et al., 2019b; Xie and Xie, 2020). There is no detailed description of how to select the parameters and the algorithm in the BPNN. ...
... For the part of the experiments aimed at investigating the importance of the features, as advised in [7], we applied linear models, support vector machines with a linear kernel, and simple linear regression models with additional regularization. All the planned experiments were conducted under a recommended cross-validation regime [26], which allows for credible performance analysis. To evaluate the tested models, we employed several widely adopted metrics: the coefficient of determination (denoted R²), which is commonly used to compare the performance of different models [19], [21]; the mean square error (MSE) or its square root [20]; and the mean absolute percentage error (MAPE) [21], which provides a good intuition of the average relative scale of the model's prediction error. ...
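The metrics named in this snippet can be computed directly from predictions. A minimal NumPy sketch follows; the function name and layout are illustrative, not taken from the cited work:

```python
import numpy as np

# Minimal sketch of the evaluation metrics named above.
def regression_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = float(np.mean((y_true - y_pred) ** 2))   # mean square error
    rmse = float(np.sqrt(mse))                     # its square root
    ss_res = float(np.sum((y_true - y_pred) ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot                     # coefficient of determination
    # MAPE assumes no zero targets; reported in percent.
    mape = float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)
    return {"R2": r2, "MSE": mse, "RMSE": rmse, "MAPE": mape}

m = regression_metrics([1.0, 2.0, 4.0], [1.0, 2.0, 3.0])
```

Note that MAPE is undefined when any target value is zero, one reason MSE/RMSE and R² are usually reported alongside it.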
... Moreover, referring to Zhou et al. [25], one can convert the input and input increment constraints imposed in (36) into the LMIs (29) and (30). ...
... It makes the feature importance better matched with its influence on the kernel space. Our previous work [10] analyzed the necessity of feature weighting and verified it by using the grid search (GS) method to select the optimal combination of weights. Grid search is an exhaustive search method, so its computational cost is very high. ...
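The exhaustive cost of grid search over feature weights is easy to quantify: with k candidate values per weight and d features, k**d candidate models must each be trained and evaluated. A small illustration, with a candidate grid and feature count that are assumed rather than taken from the cited work:

```python
import itertools

# Hypothetical illustration of why grid search over feature weights
# is expensive: every combination of candidate values is enumerated.
candidate_weights = [0.25, 0.5, 1.0, 2.0]  # assumed grid, k = 4
n_features = 5                             # assumed, d = 5
grid = list(itertools.product(candidate_weights, repeat=n_features))
print(len(grid))  # 4**5 = 1024 weight combinations to evaluate
```

Doubling the feature count squares the number of combinations, which is why exhaustive weight search scales poorly compared with gradient-based or heuristic weight selection.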
... At the heart of SVR is mapping data to a higher-dimensional feature space, wherein a linear optimal hyperplane is sought. To realize this aim, SVR harnesses the kernel trick to implicitly compute the dot product in the feature space within the original data domain, sidestepping computational intricacies [40]. The choice of kernel function is pivotal to SVR's efficacy, with prevalent kernels encompassing linear, polynomial, and radial basis function kernels. ...
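The kernel trick mentioned here can be checked numerically for a kernel whose explicit feature map is known in closed form: for the degree-2 polynomial kernel k(x, y) = (x·y + 1)² on 2-D inputs, the kernel value computed in the original domain equals the dot product of explicit 6-D feature vectors:

```python
import numpy as np

# Numerical check of the kernel trick: the implicit kernel value
# equals a dot product in an explicit higher-dimensional space.
def phi(v):
    x1, x2 = v
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 * x1, x2 * x2, s * x1 * x2])

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
k_implicit = (x @ y + 1.0) ** 2   # computed in the original 2-D domain
k_explicit = phi(x) @ phi(y)      # computed in the 6-D feature space
```

The two values agree, which is exactly what lets SVR work in a high-dimensional (for the RBF kernel, infinite-dimensional) feature space without ever forming phi(x) explicitly.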