Application of Empirical Mode Decomposition based Support Vector Regression
The EMDSVR R package applies Empirical Mode Decomposition based Support Vector Regression to univariate time series forecasting. It also provides accuracy measures, along with an option to select the proportion of training and testing data. Users can choose among the available Empirical Mode Decomposition parameters when fitting the SVR models. The package models the dependency of the study variable assuming first-order autocorrelation. It will be useful to researchers working on hybrid machine learning models.
EMDSVRhybrid - The EMDSVRhybrid function fits the Empirical Mode Decomposition based Support Vector Regression model.
Empirical mode decomposition (EMD) is one of the latest signal decomposition techniques, first proposed by Huang et al. (1998). It assumes that the data contain many coexisting oscillatory modes of significantly distinct frequencies, and that these modes superimpose on each other to form the observable time series. EMD decomposes the original non-stationary and nonlinear data into a finite and small number of independent sub-series (intrinsic mode functions and a final residue). A Support Vector Regression model is then applied to forecast each decomposed sub-series. Finally, all forecasted values are aggregated to produce the final forecast (Das et al., 2019, 2020, 2022, 2023).
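The decompose-fit-aggregate workflow described above can be sketched directly with the EMD and e1071 packages that EMDSVRhybrid loads. This is a minimal illustrative sketch, not the package's internal code; the variable names are hypothetical, and the first-lag predictor reflects the first-order autocorrelation assumption stated above.

```r
library(EMD)    # empirical mode decomposition (emd)
library(e1071)  # support vector regression (svm)

y <- rnorm(200, 30, 5)               # example series (illustrative)
dec <- emd(y)                        # decompose into IMFs and a residue
parts <- cbind(dec$imf, dec$residue) # one column per sub-series

# Fit an SVR to each sub-series using its first lag as the predictor,
# then collect the in-sample fitted values for each component.
fits <- apply(parts, 2, function(s) {
  d <- data.frame(yt = s[-1], yt1 = s[-length(s)])
  predict(svm(yt ~ yt1, data = d, kernel = "radial"))
})

# Aggregate the component fits to reconstruct the series-level fit.
final_fit <- rowSums(fits)
```

The same logic extends to forecasting: each component model is used to forecast its own sub-series, and the component forecasts are summed.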
Dragomiretskiy, K. and Zosso, D. (2014). Variational Mode Decomposition. IEEE Transactions on Signal Processing, 62(3):531-544. (doi: 10.1109/TSP.2013.2288675).
Das, P., Jha, G. K., Lama, A., Parsad, R. and Mishra, D. (2020). Empirical Mode Decomposition based Support Vector Regression for Agricultural Price Forecasting. Indian Journal of Extension Education, 56(2):7-12.(http://krishi.icar.gov.in/jspui/handle/123456789/44138).
Das, P., Jha, G. K. and Lama, A. (2023). Empirical Mode Decomposition Based Ensemble Hybrid Machine Learning Models for Agricultural Commodity Price Forecasting. Statistics and Applications, 21(1):99-112. (http://krishi.icar.gov.in/jspui/handle/123456789/77772).
Das, P., Jha, G. K., Lama, A. and Bharti (2022). EMD-SVR Hybrid Machine Learning Model and its Application in Agricultural Price Forecasting. Bhartiya Krishi Anusandhan Patrika. (DOI: 10.18805/BKAP385)
Das, P. (2019). Study on Machine Learning Techniques Based Hybrid Model for Forecasting in Agriculture. Published Ph.D. Thesis.
Choudhury, K., Jha, G. K., Das, P. and Chaturvedi, K. K. (2019). Forecasting Potato Price using Ensemble Artificial Neural Networks. Indian Journal of Extension Education, 55(1):71-77.(http://krishi.icar.gov.in/jspui/handle/123456789/44873).
Das, P., Lama, A. and Jha, G. K. (2022). Variational Mode Decomposition based Machine Learning Models Optimized with Genetic Algorithm for Price Forecasting. Journal of the Indian Society of Agricultural Statistics, 76(3), 141-150. (http://krishi.icar.gov.in/jspui/handle/123456789/76648)
## Example: how the package works
library(EMDSVRhybrid)
#> Loading required package: EMD
#> Loading required package: fields
#> Loading required package: spam
#> Spam version 2.11-0 (2024-10-03) is loaded.
#> Type 'help( Spam)' or 'demo( spam)' for a short introduction
#> and overview of this package.
#> Help for individual functions is also obtained by adding the
#> suffix '.spam' to the function name, e.g. 'help( chol.spam)'.
#>
#> Attaching package: 'spam'
#> The following objects are masked from 'package:base':
#>
#> backsolve, forwardsolve
#> Loading required package: viridisLite
#>
#> Try help(fields) to get started.
#> Loading required package: locfit
#> locfit 1.5-9.10 2024-06-24
#> Loading required package: e1071
# Application
# Random time series data generation
set.seed(6)
example_data <- rnorm(500, 30, 5)
# Parameter setting: proportion of data used for training
k <- 0.9
# Application of the EMD-SVR model
EMDSVRhybrid(example_data, k, funct = "radial")
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 383
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 415
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 390
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 339
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 232
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 127
#>
#>
#> Call:
#> svm(formula = yt ~ ., data = traindata, kernel = funct)
#>
#>
#> Parameters:
#> SVM-Type: eps-regression
#> SVM-Kernel: radial
#> cost: 1
#> gamma: 1
#> epsilon: 0.1
#>
#>
#> Number of Support Vectors: 8
#> $Prediction_Accuracy_EMDSVR
#> RMSE_out MAD_out MAPE_out
#> [1,] 7.500744 6.614184 0.2358237
#>
#> $Final_Prediction_EMDSVR
#> 449 450 451 452 453 454 455 456
#> 32.91201 25.31079 32.20177 24.24060 23.50552 21.88583 26.99437 22.43297
#> 457 458 459 460 461 462 463 464
#> 23.20824 27.00398 31.69754 26.69485 22.50922 31.57307 28.60468 33.27870
#> 465 466 467 468 469 470 471 472
#> 27.38401 30.81163 31.75324 34.08287 33.66007 28.78933 29.34951 32.40041
#> 473 474 475 476 477 478 479 480
#> 25.29133 33.06124 27.51555 23.31895 23.61147 23.81729 30.89904 30.85134
#> 481 482 483 484 485 486 487 488
#> 28.00803 33.28380 32.62908 32.80473 30.30636 35.83409 34.47249 28.85199
#> 489 490 491 492 493 494 495 496
#> 23.84510 31.01968 35.06868 26.88791 33.68487 26.30491 27.81818 34.77597
#> 497
#> 33.48277
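The components printed above are returned as a named list, so they can be captured and reused rather than only printed. A minimal sketch, assuming the call from the example (the object name `fit` is hypothetical):

```r
fit <- EMDSVRhybrid(example_data, k, funct = "radial")
fit$Prediction_Accuracy_EMDSVR  # RMSE, MAD and MAPE on the test set
fit$Final_Prediction_EMDSVR     # forecasts for the test portion
```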