--- title: "Introduction to BLiSS method" author: "Paul-Marie Grollemund" date: "`r Sys.Date()`" output: html_document: toc: yes number_sections: true vignette: > %\VignetteIndexEntry{Introduction to BliSS method} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- ```{r setup, include = FALSE} knitr::opts_chunk$set( collapse = TRUE, comment = "#>" ) ``` ```{r,echo=TRUE,message=FALSE, warning=FALSE} library(bliss) ``` This vignette describes step by step how to use the BLiSS method. Below, you can find the following implemented features: * Simulate data to test the BLiSS model * Obtain a sample of the posterior distribution with a Gibbs Sampler * Plot the posterior distribution of the coefficient function and the posterior distribution of the support * Compute the different Bayesian estimators # One single functional covariate case ## Simulate a data set In order to simulate a proper dataset for Bliss application, some characteristics must be specified: * $n$ (the number of observations), * $p$ (number of instant of measure), * beta$\_$types (the shape of the coefficient function), and * grids$\_$lim, a 2-vector (to define the domain of curves $x_{i}(.)$). Based on these parameters, data can be simulated (curves $x_{i}(.)$ and real values $y_{i}$) from the functional linear regression model by using the *sim* function, as suggested in the following chunck. ```{r ,eval=TRUE,include = TRUE} set.seed(1) param <- list( # define the "param" to simulate data Q=1, # the number of functional covariate n=50, # n is the sample size and p is the p=c(20), # number of time observations of the curves beta_types=c("smooth"), # define the shape of the "true" coefficient function grids_lim=list(c(0,1))) # Give the beginning and the end of the observation's domain of the functions. data <- sim(param) # Simulate the data ``` ## How to apply the Bliss method In order to apply the Bliss method, the main function to use is *fit$\_$Bliss*. This function provides the following outputs: * a posterior sample of the Bliss model, * an approximation of the posterior distribution of the coefficient function, * a piecewise constant estimate (stepfunction) of the coefficient function, which is computed thanks to an optimization algorithm, * an estimation of the support, which shoulb be useful if your purpose is to detect periods for which the functional coviarate has an (linear) impact on the dependent scalar variable, * the posterior densities of the posterior sample, which should be used to compute model choice criterion. An important required argument of the previous function is **param**, which is a list containing: * **iter**, the number of iterations for the Gibbs algorithm, * **burnin** the number of iteration to drop at the beginning of the Gibbs Sampler, * **K**, hyperparameter $K$ of the Bliss model, * **grids**, the grid of instant for which the curves $x_{i}(.)$ are measured, * **prior$\_$beta**, an argument specifying a prior distribution of the slope coefficient $\beta$, (only the **Ridge$\_$Zellner** case is considered in this vignette), * and **phi$\_$l**, an argument specifying a prior distribution for $\ell$ the half-width of the intervals (only the **Gamma`** case is considered in this vignette), Find below, an example of use of this function and a sketch of the structure of the returned object. ```{r ,eval=TRUE, include = TRUE} param <- list( # define the required values of the Bliss method. iter=5e2, # The number of iteration of the main numerical algorithm of Bliss. 
## How to apply the Bliss method

To apply the Bliss method, the main function to use is `fit_Bliss`. This function provides the following outputs:

* a posterior sample of the Bliss model,
* an approximation of the posterior distribution of the coefficient function,
* a piecewise constant estimate (step function) of the coefficient function, computed with an optimization algorithm,
* an estimate of the support, which is useful if your purpose is to detect the periods during which the functional covariate has a (linear) impact on the dependent scalar variable,
* the posterior densities of the posterior sample, which can be used to compute model choice criteria.

An important required argument of this function is **param**, a list containing:

* **iter**, the number of iterations of the Gibbs algorithm,
* **burnin**, the number of iterations to drop at the beginning of the Gibbs sampler,
* **K**, the hyperparameter $K$ of the Bliss model,
* **grids**, the grids of instants at which the curves $x_{i}(.)$ are measured,
* **prior_beta**, an argument specifying the prior distribution of the slope coefficient $\beta$ (only the **Ridge_Zellner** case is considered in this vignette),
* and **phi_l**, an argument specifying the prior distribution of $\ell$, the half-width of the intervals (only the **Gamma** case is considered in this vignette).

Below is an example of use of this function, together with a sketch of the structure of the returned object.

```{r ,eval=TRUE, include = TRUE}
param <- list(     # define the required values of the Bliss method
  iter=5e2,        # the number of iterations of the main numerical algorithm of Bliss
  burnin=2e2,      # the number of burn-in iterations for the Gibbs sampler
  K=c(3))          # the number of intervals of the coefficient function

res_bliss <- fit_Bliss(data=data,param=param)

# Structure of a Bliss object
# str(res_bliss)
```

## Graphical results

This section presents how to obtain the main graphical results (posterior quantities) derived from the Bliss method.

### Coefficient function

Considering the functional linear regression (FLR) model, and specifically the *scalar-on-function* case, the main model parameter to infer is the coefficient function $\beta(.)$. The following chunk shows how to plot the posterior distribution of the coefficient function:

```{r ,eval=FALSE, include = TRUE,fig.height=5,fig.width=7}
library(ggplot2)
image_Bliss(res_bliss$beta_posterior_density,param,q=1)
```

In addition to this plot, one usually wants to display a point estimate of the coefficient function (which is a function). With the following code, you can access:

* **the Bliss estimate**, a piecewise constant version of the coefficient function, and
* **the smooth estimate**, the standard Bayesian estimate of the coefficient function (standard meaning that it minimizes the posterior $L^2$ loss).

```{r ,eval=FALSE, include = TRUE,fig.height=5,fig.width=7}
image_Bliss(res_bliss$beta_posterior_density,param,q=1) +
  lines_bliss(res_bliss$data$grids[[1]],res_bliss$Bliss_estimate[[1]]) +
  lines_bliss(res_bliss$data$grids[[1]],res_bliss$smooth_estimate[[1]],lty = "dashed") +
  lines_bliss(res_bliss$data$grids[[1]],data$betas[[1]],col="purple")
```

The solid black line is the Bliss estimate, the dashed black line is the smooth estimate and the solid purple line is the true coefficient function.

### Support of the coefficient function

Depending on the scientific question, one could aim to infer the coefficient function, or alternatively focus only on its support. In that case, the sign and the magnitude of the coefficient function can be considered as nuisance parameters. Therefore, the Bliss method provides a specific estimation procedure for the support of the coefficient function, which relies on the posterior distribution of the coefficient function. It consists in deriving the posterior probabilities $\alpha(t|D)$, for each $t$ in the domain $\mathcal T$ of the functional data, which correspond to the probabilities (conditionally on the observed data) that the support of the coefficient function covers the time $t$. To plot these posterior probabilities, use the following code:

```{r ,eval=TRUE, include = TRUE,fig.height=5,fig.width=7}
plot(res_bliss$alpha[[1]],type="o",xlab="time",ylab="posterior probabilities")
```

From these posterior probabilities, the support estimate is derived by thresholding. Without prior information guiding the estimation procedure, the default threshold is 0.5. The estimated support is then defined as the collection of times $t$ for which the posterior probability satisfies $\alpha(t|D) > 0.5$.
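
The 0.5 threshold can be adapted to obtain a more or less conservative support estimate. As a minimal sketch (not executed), the time points exceeding a user-chosen threshold can be extracted directly from `res_bliss$alpha[[1]]` and the measurement grid; the value 0.8 below is only an illustrative assumption, and the sketch assumes that `alpha[[1]]` contains one probability per grid point.

```{r ,eval=FALSE, include = TRUE}
# Sketch only: extract the support for a custom threshold (0.8 is an arbitrary choice).
threshold <- 0.8
grid  <- res_bliss$data$grids[[1]]
alpha <- res_bliss$alpha[[1]]
grid[alpha > threshold]        # time points included in the estimated support
```

The next chunk displays the default estimate, i.e. the support obtained with the 0.5 threshold.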
```{r ,eval=TRUE, include = TRUE,fig.height=5,fig.width=7}
plot(res_bliss$alpha[[1]],type="o",xlab="time",ylab="posterior probabilities")
abline(h=0.5,col=2,lty=2)
for(i in 1:nrow(res_bliss$support_estimate[[1]])){
  segments(res_bliss$support_estimate[[1]]$begin[i],0.05,
           res_bliss$support_estimate[[1]]$end[i],0.05,col="red")
  points(res_bliss$support_estimate[[1]]$begin[i],0.05,col="red",pch="|",lwd=2)
  points(res_bliss$support_estimate[[1]]$end[i],0.05,col="red",pch="|",lwd=2)
}
```

A summary of the support estimate is provided by:

```{r ,eval=TRUE, include = TRUE,fig.height=5,fig.width=7}
res_bliss$support_estimate[[1]]
```

# Multiple functional covariates

To keep the computational time of this vignette low, the chunks in this section are not executed. The functions, objects and procedures are mostly similar to those of the single functional covariate case. The main differences are that:

* the number of functional covariates has to be specified in the `param` object, and
* the posterior quantities are available as elements of the output lists.

## Simulate a data set

```{r ,eval=FALSE, include = TRUE}
param <- list(Q=2,
              n=50,
              p=c(40,10),
              beta_shapes=c("simple","smooth"),
              grids_lim=list(c(0,1),c(0,2)))

data <- sim(param)
```

## How to apply the Bliss method

```{r ,eval=FALSE, include = TRUE}
param <- list(     # define the required values of the Bliss method
  iter=1e3,        # the number of iterations of the main numerical algorithm of Bliss
  burnin=2e2,      # the number of burn-in iterations for the Gibbs sampler
  K=c(3,3))        # the number of intervals of each coefficient function

res_Bliss_mult <- fit_Bliss(data=data,param=param)
```

## Graphical results

```{r ,eval=FALSE, include = TRUE,fig.height=5,fig.width=7}
image_Bliss(res_Bliss_mult$beta_posterior_density,param,q=1) +
  lines_bliss(res_Bliss_mult$data$grids[[1]],res_Bliss_mult$Bliss_estimate[[1]]) +
  lines_bliss(res_Bliss_mult$data$grids[[1]],res_Bliss_mult$smooth_estimate[[1]],lty = "dashed") +
  lines_bliss(res_Bliss_mult$data$grids[[1]],data$betas[[1]],col="purple")

image_Bliss(res_Bliss_mult$beta_posterior_density,param,q=2) +
  lines_bliss(res_Bliss_mult$data$grids[[2]],res_Bliss_mult$Bliss_estimate[[2]]) +
  lines_bliss(res_Bliss_mult$data$grids[[2]],res_Bliss_mult$smooth_estimate[[2]],lty = "dashed") +
  lines_bliss(res_Bliss_mult$data$grids[[2]],data$betas[[2]],col="purple")
```

# Session information

```{r session,echo=FALSE,message=FALSE, warning=FALSE}
sessionInfo()
```