Application of Variational Mode Decomposition Based Machine Learning Models
The VMDML R package is designed for applying Variational Mode Decomposition based machine learning models to univariate time series forecasting. The package provides five functions, i.e., VMDARIMA, VMDELM, VMDRF, VMDSVR and VMDTDNN. It also provides accuracy measures along with an option to select the proportion of the training and testing data sets. Users can choose among the available Variational Mode Decomposition parameters when fitting the ML models. In this package the dependency of the study variable is modelled assuming first-order autocorrelation (a minimal sketch of this setup follows the list of functions below). This package will help researchers working in the area of hybrid machine learning models.
- VMDARIMA: fits the Variational Mode Decomposition based Autoregressive Integrated Moving Average (ARIMA) model.
- VMDELM: fits the Variational Mode Decomposition based Extreme Learning Machine model.
- VMDRF: fits the Variational Mode Decomposition based Random Forest model.
- VMDSVR: fits the Variational Mode Decomposition based Support Vector Regression model.
- VMDTDNN: fits the Variational Mode Decomposition based Time Delay Neural Network model.
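As mentioned above, the dependency of the study variable is modelled assuming first-order autocorrelation, i.e., each series (or each decomposed mode) is regressed on its own one-step lagged value. The following is only a minimal sketch of that lag-1 setup, not the VMDML internals; the series `y` and the use of `lm()` are illustrative placeholders for whichever ML model (RF, SVR, ELM, TDNN) is actually fitted.

```r
# Minimal sketch (illustrative only): build a lag-1 data set so that y_t is
# explained by y_(t-1), then fit any regression model to it.
set.seed(6)
y <- rnorm(100, 6.6, 0.36)

lagged    <- embed(y, 2)                # column 1 = y_t, column 2 = y_(t-1)
traindata <- data.frame(yt = lagged[, 1], yt1 = lagged[, 2])

fit <- lm(yt ~ yt1, data = traindata)   # placeholder for an RF/SVR/ELM/TDNN fit
predict(fit, newdata = data.frame(yt1 = tail(y, 1)))   # one-step-ahead forecast
```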
Variational Mode Decomposition (VMD) is one of the latest signal decomposition techniques, similar to EMD, first proposed by Dragomiretskiy and Zosso (2014). It is an entirely non-recursive variational mode decomposition model in which the modes are extracted concurrently. The algorithm generates an ensemble of modes and their respective center frequencies, such that the modes collectively reproduce the input signal. A machine learning model such as SVR, TDNN or RF is then applied to each decomposed component to forecast it, and finally all forecasted values are aggregated to produce the final forecast (Das et al., 2019, 2020, 2022, 2023).
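For illustration, a minimal sketch of this decompose-forecast-aggregate idea is given below. It is not the VMDML implementation: `modes` is assumed to be a matrix whose K columns are the VMD modes of the series (obtained from any VMD implementation), and each mode is forecast with an ARIMA model from the forecast package before the mode-wise forecasts are summed.

```r
# Minimal sketch of decompose-forecast-aggregate (assumed input: 'modes')
library(forecast)

set.seed(6)
y     <- rnorm(300, 6.6, 0.36)        # example series
modes <- cbind(y / 2, y / 4, y / 4)   # placeholder for the K = 3 VMD modes
h     <- 60                           # out-of-sample forecast horizon

# Fit an ARIMA model to each mode and forecast it h steps ahead
mode_fc <- apply(modes, 2, function(m) {
  fit <- auto.arima(ts(m))
  as.numeric(forecast(fit, h = h)$mean)
})

# Sum the mode-wise forecasts to obtain the final forecast of the series
final_forecast <- rowSums(mode_fc)
head(final_forecast)
```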
Dragomiretskiy, K. and Zosso, D. (2014). Variational Mode Decomposition. IEEE Transactions on Signal Processing, 62(3): 531-544. (doi: 10.1109/TSP.2013.2288675).
Das, P., Jha, G. K., Lama, A., Parsad, R. and Mishra, D. (2020). Empirical Mode Decomposition based Support Vector Regression for Agricultural Price Forecasting. Indian Journal of Extension Education, 56(2): 7-12. (http://krishi.icar.gov.in/jspui/handle/123456789/44138).
Das, P., Jha, G. K. and Lama, A. (2023). Empirical Mode Decomposition Based Ensemble Hybrid Machine Learning Models for Agricultural Commodity Price Forecasting. Statistics and Applications, 21(1): 99-112. (http://krishi.icar.gov.in/jspui/handle/123456789/77772).
Das, P., Jha, G. K., Lama, A. and Bharti (2022). EMD-SVR Hybrid Machine Learning Model and its Application in Agricultural Price Forecasting. Bhartiya Krishi Anusandhan Patrika. (doi: 10.18805/BKAP385).
Das, P. (2019). Study on Machine Learning Techniques Based Hybrid Model for Forecasting in Agriculture. Published Ph.D. Thesis.
Choudhury, K., Jha, G. K., Das, P. and Chaturvedi, K. K. (2019). Forecasting Potato Price using Ensemble Artificial Neural Networks. Indian Journal of Extension Education, 55(1): 71-77. (http://krishi.icar.gov.in/jspui/handle/123456789/44873).
Das, P., Lama, A. and Jha, G. K. (2022). Variational Mode Decomposition based Machine Learning Models Optimized with Genetic Algorithm for Price Forecasting. Journal of the Indian Society of Agricultural Statistics, 76(3): 141-150. (http://krishi.icar.gov.in/jspui/handle/123456789/76648).
## Example of how the package works
library(VMDML)
#Application
# A random time series dataset generation
set.seed(6)
data3 <- rnorm(300, 6.6, 0.36)
#Parameter setting
alpha = 2000   # moderate bandwidth constraint
tau = 0        # noise tolerance (no strict fidelity enforcement)
K = 3          # number of modes to be extracted
k = 0.8        # proportion of data used as the training set
DC = FALSE     # no DC (zero-frequency) mode imposed
init = 1       # center frequencies initialized uniformly
tol = 1e-6     # convergence tolerance
#Application of VMDARIMA model
VMDARIMA(data3,.8,alpha,tau,K,DC,init,tol)
#> Registered S3 method overwritten by 'quantmod':
#> method from
#> as.zoo.data.frame zoo
#> $Total_No_IMF
#> [1] 3
#>
#> $Prediction_Accuracy_VMDARIMA
#> RMSE_out MAD_out MAPE_out ME_out
#> [1,] 0.3623889 0.2843151 0.0435229 0.8769506
#>
#> $Final_Prediction_VMDARIMA
#> Time Series:
#> Start = 1
#> End = 60
#> Frequency = 1
#> [1] 6.385717 6.517013 6.560424 6.649272 6.625478 6.550825 6.467676 6.452364
#> [9] 6.509343 6.590927 6.628264 6.593008 6.516362 6.462356 6.473832 6.538703
#> [17] 6.601909 6.612043 6.562997 6.497087 6.468619 6.499209 6.561528 6.603554
#> [25] 6.591915 6.538280 6.488016 6.481885 6.523062 6.576097 6.597344 6.570817
#> [33] 6.520063 6.487378 6.498617 6.542831 6.582688 6.585932 6.551426 6.508799
#> [41] 6.493073 6.515940 6.557208 6.582540 6.571975 6.535585 6.504041 6.502815
#> [49] 6.531659 6.565867 6.577382 6.557765 6.524257 6.504696 6.514427 6.544329
#> [57] 6.569242 6.569096 6.545040 6.517630
#Application of VMDELM model
#VMDELM(data3,0.8,alpha,tau,K,DC,init,tol)
#Parameter setting for RF model
m = 3   # mtry: number of predictors tried at each split
n = 5   # ntree: number of trees grown
#Application of VMDRF model
VMDRF(data3,k,alpha,tau,K,DC,init,tol,m,n)
#> Warning in randomForest.default(m, y, ...): invalid mtry: reset to within valid
#> range
#>
#> Call:
#> randomForest(formula = yt ~ ., data = traindata, mtry = m, ntree = n)
#> Type of random forest: regression
#> Number of trees: 5
#> No. of variables tried at each split: 1
#>
#> Mean of squared residuals: 7.736168e-05
#> % Var explained: 96.34
#> Warning in randomForest.default(m, y, ...): invalid mtry: reset to within valid
#> range
#>
#> Call:
#> randomForest(formula = yt ~ ., data = traindata, mtry = m, ntree = n)
#> Type of random forest: regression
#> Number of trees: 5
#> No. of variables tried at each split: 1
#>
#> Mean of squared residuals: 0.01355887
#> % Var explained: -18.04
#> Warning in randomForest.default(m, y, ...): invalid mtry: reset to within valid
#> range
#>
#> Call:
#> randomForest(formula = yt ~ ., data = traindata, mtry = m, ntree = n)
#> Type of random forest: regression
#> Number of trees: 5
#> No. of variables tried at each split: 1
#>
#> Mean of squared residuals: 0.01425294
#> % Var explained: -9.06
#> $Total_No_IMF
#> [1] 3
#>
#> $Prediction_Accuracy_VMDRF
#> RMSE_out MAD_out MAPE_out ME_out
#> [1,] 0.349558 0.2730837 0.04138256 0.9957001
#>
#> $Final_Prediction_VMDRF
#> [1] 6.376167 6.465836 6.702820 6.479088 6.552101 6.535193 6.556291 6.416575
#> [9] 6.582246 6.475146 6.407093 6.551189 6.338622 6.520634 6.831918 6.644199
#> [17] 6.689642 6.708327 6.359298 6.443065 6.514585 6.810095 6.563176 6.624794
#> [25] 6.650428 6.381170 6.578740 6.734643 6.644227 6.642471 6.392661 6.625165
#> [33] 6.435068 6.435709 6.622126 6.748587 6.561443 6.610452 6.483202 6.385544
#> [41] 6.484079 6.513913 6.707880 6.630965 6.406792 6.448278 6.638272 6.544460
#> [49] 6.545139 6.640979 6.403684 6.350179 6.429551 6.449874 6.680484 6.599530
#> [57] 6.520443 6.514219 6.387496
#Application of VMDSVR model with radial kernel and nu-regression type
VMDSVR(data3,.8,alpha,tau,K,DC,init,tol,"radial","nu-regression")
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = ker.funct, type = svm.type)
#>
#>
#> Parameters:
#> SVM-Type: nu-regression
#> SVM-Kernel: radial
#> cost: 1
#> nu: 0.5
#>
#> Number of Support Vectors: 124
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = ker.funct, type = svm.type)
#>
#>
#> Parameters:
#> SVM-Type: nu-regression
#> SVM-Kernel: radial
#> cost: 1
#> nu: 0.5
#>
#> Number of Support Vectors: 123
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = ker.funct, type = svm.type)
#>
#>
#> Parameters:
#> SVM-Type: nu-regression
#> SVM-Kernel: radial
#> cost: 1
#> nu: 0.5
#>
#> Number of Support Vectors: 124
#> $Total_No_IMF
#> [1] 3
#>
#> $Prediction_Accuracy_VMDSVR
#> RMSE_out MAD_out MAPE_out ME_out
#> [1,] 0.3320138 0.2569197 0.03911157 0.9101023
#>
#> $Final_Prediction_VMDSVR
#> [1] 6.480423 6.536723 6.635920 6.549690 6.511117 6.529582 6.482741 6.575923
#> [9] 6.594332 6.495253 6.502583 6.525639 6.515993 6.603834 6.796880 6.665824
#> [17] 6.596457 6.622729 6.462741 6.502182 6.706677 6.735452 6.675357 6.639533
#> [25] 6.481453 6.478030 6.568227 6.606601 6.702437 6.711362 6.546776 6.544822
#> [33] 6.549525 6.529673 6.705733 6.742729 6.634898 6.570914 6.486457 6.433356
#> [41] 6.592343 6.692235 6.645196 6.656943 6.501565 6.370021 6.470655 6.585128
#> [49] 6.566176 6.616223 6.532478 6.373007 6.420126 6.479545 6.549845 6.612550
#> [57] 6.586957 6.482908 6.421796
#Application of VMDTDNN model
#VMDTDNN(data3,.8,alpha,tau,K,DC,init,tol,1,5,20,100)