Hit List

Title:

Efficient construction of reversible jump Markov chain Monte Carlo proposal distributions

Author:

Description:

The major implementational problem for reversible jump Markov chain Monte Carlo methods is that there is commonly no natural way to choose jump proposals since there is no Euclidean structure in the parameter space to guide our choice. We consider mechanisms for guiding the choice of proposal. The first group of methods is based on an analysis of acceptance probabilities for jumps. Essentially, these methods involve a Taylor series expansion of the acceptance probability around certain canonical jumps and turn out to have close connections to Langevin algorithms. The second group of methods generalizes the reversible jump algorithm by using the so-called saturated space approach. These allow the chain to retain some degree of memory so that, when proposing to move from a smaller to a larger model, information is borrowed from the last time that the reverse move was performed. The main motivation for this paper is that, in complex problems, the probability that the Markov chain moves between such spaces may be prohibitively small, as the probability mass can be very thinly spread across the space. Therefore, finding reasonable jump proposals becomes extremely important. We illustrate the procedure by using several examples of reversible jump Markov chain Monte Carlo applications including the analysis of autoregressive time series, graphical Gaussian modelling and mixture modelling. Copyright 2003 Royal Statistical Society.
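For reference, the general reversible jump acceptance probability (Green, 1995) around which such Taylor expansions are taken has the standard form below; the notation here is generic and may differ from the paper's:

```latex
\alpha\big((\theta, m) \to (\theta', m')\big)
  = \min\left\{1,\;
    \frac{\pi(\theta', m')\, j(m \mid m')\, g'(u')}
         {\pi(\theta, m)\, j(m' \mid m)\, g(u)}
    \left|\frac{\partial(\theta', u')}{\partial(\theta, u)}\right|
  \right\}
```

where \(\pi\) is the target posterior over model/parameter pairs, \(j\) gives the model-jump probabilities, \(g\) and \(g'\) are the densities of the auxiliary variables \(u, u'\) used to match dimensions, and the final factor is the Jacobian of the dimension-matching transformation.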

Document Type:

article

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

A CTAB-based method for the preparation of total protein extract of wine spoilage microorganisms for proteomic analysis.

Author:

Description:

Mapping the proteome of microorganisms by 2D electrophoresis is often a hard task, because many contaminants, e.g. polysaccharides of the cell wall and nucleic acids, can obstruct the pores of the IEF gel, resulting in streaks and smears. A protocol based on the use of the cationic detergent cetyltrimethylammonium bromide (CTAB) and its salt-dependent solubility was developed. The cellulose-producing strain Gluconacetobacter hansenii AAB0248 was resolved on 7 cm minigels into over 500 protein spots (a hundred more than with protocols reported in the literature). The method was further employed for mapping the proteome of some acid-adapted, wine spoilage microorganisms, e.g. acetic acid bacteria and a yeast.

Year of Publication:

2009

Document Type:

info:eu-repo/UGOV/Articolo su Rivista

Language:

eng

Subjects:

[PUBMED]:19249253 ; [MINISTERO]:Articolo su rivista ; [UGOV]:Articolo su Rivista ; [UGOV_AUX]:Biochemistry & Biophysics

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Nonparametric Convergence Assessment for MCMC Model Selection

Author:

Description:

In this paper, we consider the problem of assessing the performance of MCMC model selection algorithms, using a variety of nonparametric techniques. We consider a wide range of model selection problems to which MCMC model selection may be applied and propose several distance measures which can be used to quantify the similarity between multiple replications. These measures may be used to assess convergence by examining how "close" these replications of the chain are, since if all chains are at stationarity then this distance should be small. We illustrate our approaches with several practical examples.
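As an illustration of the idea (not the paper's own measures, whose exact definitions are not reproduced here), one simple distance between two replications of a model-selection chain is the total variation distance between their empirical model-visit distributions:

```python
from collections import Counter

def model_visit_distribution(chain):
    """Empirical distribution of model indices visited by one MCMC replication."""
    counts = Counter(chain)
    n = len(chain)
    return {model: c / n for model, c in counts.items()}

def total_variation(chain_a, chain_b):
    """Total variation distance between the model-visit distributions of two
    replications. A small distance across replications is consistent with
    (though not proof of) convergence of the model-selection chain."""
    p = model_visit_distribution(chain_a)
    q = model_visit_distribution(chain_b)
    models = set(p) | set(q)
    return 0.5 * sum(abs(p.get(m, 0.0) - q.get(m, 0.0)) for m in models)

# Two hypothetical replications visiting model indices 1..3
rep1 = [1, 1, 2, 3, 2, 1, 2, 2, 3, 1]
rep2 = [1, 2, 2, 3, 1, 1, 2, 3, 3, 1]
print(total_variation(rep1, rep2))  # total variation distance ≈ 0.1
```

The paper's chi-squared and Kolmogorov–Smirnov measures mentioned in the subject keywords play an analogous role; total variation is used here only because it is the simplest to state.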

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2012-01-17

Source:

http://www.statslab.cam.ac.uk/~steve/mypapers/brogp01.ps

Document Type:

text

Language:

en

Subjects:

Variable selection ; graphical models ; mixture models ; autoregressive time series ; chi-squared ; Kolmogorov-Smirnov ; reversible jump MCMC ; birth-death processes

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Efficient construction of reversible jump MCMC proposal distributions

Author:

Description:

In this paper we consider mechanisms for guiding the proposal choice. The first group of methods is based upon an analysis of acceptance probabilities for jumps. Essentially, these methods involve a Taylor series expansion of the acceptance probability around certain canonical jumps, and turn out to have close connections to Langevin algorithms. The second group of methods generalises the reversible jump algorithm using the so-called dual space approach. These allow the chain to retain some degree of memory so that, when proposing to move from a smaller to a larger model, information is borrowed from the last time that the reverse move was performed.

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2009-04-14

Source:

http://www.statslab.cam.ac.uk/~steve/mypapers/brogr00.ps

Document Type:

text

Language:

en

Subjects:

Bayesian model selection ; Langevin algorithms ; Optimal scaling ; Autoregressive time series ; Mixture modelling ; Graphical models

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Bayesian Networks for enterprise risk assessment (Physica A 382 (2007) 22–28)

Author:

Description:

According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. Risk, in general, is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative data) are used. Moreover, qualitative data must be converted into numerical values or bounds to be used in the model. In the case of enterprise risk assessment the considered risks are, for instance, strategic, operational, legal and reputational, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. Bayesian networks (BNs) are a useful tool for integrating different information and in particular for studying the joint distribution of risks by using data collected from experts. In this paper we show a possible approach for building a BN in the particular case in which only prior probabilities of node states and marginal correlations between nodes are available, and the variables have only two states. © 2007 Elsevier B.V. All rights reserved.
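As a sketch of the setting the abstract describes (two-state nodes, given marginal probabilities and a correlation), the joint distribution of a pair of binary variables can be recovered from those quantities as below. This is an illustrative reconstruction of the two-node building block; the paper's actual network construction may differ:

```python
import math

def binary_joint(p, q, rho):
    """Joint distribution of two binary (0/1) nodes X and Y given their
    marginals P(X=1)=p, P(Y=1)=q and correlation rho. For Bernoulli variables,
    rho = (E[XY] - pq) / sqrt(p(1-p) q(1-q)), which fixes P(X=1, Y=1)."""
    p11 = p * q + rho * math.sqrt(p * (1 - p) * q * (1 - q))
    p10 = p - p11          # P(X=1, Y=0)
    p01 = q - p11          # P(X=0, Y=1)
    p00 = 1.0 - p11 - p10 - p01
    joint = {(1, 1): p11, (1, 0): p10, (0, 1): p01, (0, 0): p00}
    # Not every (p, q, rho) triple is feasible; negative cells signal that.
    if any(v < 0 for v in joint.values()):
        raise ValueError("marginals and correlation are incompatible")
    return joint

j = binary_joint(0.3, 0.4, 0.0)  # independent case: P(X=1, Y=1) ≈ 0.3 * 0.4
```

The feasibility check matters in practice: expert-elicited correlations can be inconsistent with the elicited marginals, and the negative-cell test flags exactly that.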

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2008-07-01

Source:

http://www2.polito.it/eventi/apfa5/Proceedings/Physica A 382 2007/Bonafede.pdf

Document Type:

text

Language:

en

Subjects:

Bayesian Networks ; Enterprise risk assessment ; Mutual information

DDC:

310 Collections of general statistics *(computed)*

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Dual energy CT-based characterization of x-ray attenuation properties of breast equivalent material plates

Author:

Description:

Breast cancer is the second most common cancer and the leading cause of cancer mortality in women [1]. Breast cancer screening programs, by means of mammographic imaging, are set up to detect breast cancer at an early stage to improve treatment effectiveness [2]. Breast density has been found to be an important inherent risk factor for breast cancer [3]. This finding was mainly based on surface breast density (the ratio of the surface of fibroglandular tissue in the image to the total surface of the breast in the image). In order to improve the accuracy of density estimation, methods have been proposed to estimate volumetric breast density (the ratio of the volume of fibroglandular tissue to the total volume of the breast) from mammographic images. Those methods [4], [5], [6] have been based on calibrations using a breast equivalent material available from a single manufacturer (CIRS Inc., Norfolk, VA), following the method of White [7] and Fatouros [8]. Consequently, this breast equivalent material is the cornerstone for estimating breast densities from mammographic X-ray imaging. A fundamental requirement for the phantom material is a close match in attenuation properties between the phantom material and real breast tissue in the energy domain of mammography (i.e. 15 keV to 30 keV). The material should in addition allow con…

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2013-10-29

Source:

http://hal.archives-ouvertes.fr/docs/00/84/33/65/PDF/Dual_energy_CT_characterization_of_breast_equivalent_material_SPIE2012_Geeraert_al.pdf

Document Type:

text

Language:

en

DDC:

616 Diseases *(computed)*

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Decomposable graphical Gaussian model determination

Author:

Description:

We propose a methodology for Bayesian model determination in decomposable graphical Gaussian models. To achieve this aim we consider a hyper inverse Wishart prior distribution on the concentration matrix for each given graph. To ensure compatibility across models, such prior distributions are obtained by marginalisation from the prior conditional on the complete graph. We explore alternative structures for the hyperparameters of the latter, and their consequences for the model. Model determination is carried out by implementing a reversible jump Markov chain Monte Carlo sampler. In particular, the dimension-changing move we propose involves adding or dropping an edge from the graph. We characterise the set of moves which preserve the decomposability of the graph, giving a fast algorithm for maintaining the junction tree representation of the graph at each sweep. As the state variable, we use the incomplete variance-covariance matrix, containing only the elements for which the corresponding element of the inverse is nonzero. This allows all computations to be performed locally, at the clique level, which is a clear advantage for the analysis of large and complex datasets. Finally, the statistical and computational performance of the procedure is illustrated by means of both artificial and real datasets.
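The decomposability constraint behind such edge add/drop moves can be sketched with a standard chordality test (an undirected Gaussian graphical model is decomposable iff its graph is chordal). The sketch below uses maximum cardinality search plus the Tarjan–Yannakakis ordering check; it is a generic illustration, not the paper's fast junction-tree maintenance algorithm:

```python
def mcs_order(adj):
    """Maximum cardinality search: visit vertices, always taking one with the
    most already-visited neighbours. adj maps vertex -> set of neighbours."""
    order, weight = [], {v: 0 for v in adj}
    unvisited = set(adj)
    while unvisited:
        v = max(unvisited, key=lambda x: weight[x])
        order.append(v)
        unvisited.remove(v)
        for w in adj[v]:
            if w in unvisited:
                weight[w] += 1
    return order

def is_chordal(adj):
    """Tarjan-Yannakakis test: the graph is chordal (hence the model is
    decomposable) iff the MCS order passes this per-vertex check."""
    order = mcs_order(adj)
    pos = {v: i for i, v in enumerate(order)}
    for v in order:
        earlier = [w for w in adj[v] if pos[w] < pos[v]]
        if not earlier:
            continue
        u = max(earlier, key=lambda w: pos[w])
        if any(w != u and w not in adj[u] for w in earlier):
            return False
    return True

# Edge-perturbation move (sketch): toggle one edge, keep only decomposable graphs.
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}  # path a-b-c, decomposable
adj["a"].add("c"); adj["c"].add("a")             # propose adding edge (a, c)
print(is_chordal(adj))                           # triangle is chordal: True
```

In a sampler, a proposed edge toggle that breaks chordality would simply be rejected (or never proposed); the paper instead characterises the decomposability-preserving moves directly, which avoids re-testing the whole graph at each sweep.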

Publisher:

Oxford University Press

Year of Publication:

1999-12-01

Document Type:

TEXT

Language:

en

Subjects:

Articles

DDC:

519 Probabilities & applied mathematics *(computed)*

Rights:

Copyright (C) 1999, Biometrika Trust

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Bayesian networks for enterprise risk assessment

Description:

According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. In general, risk is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative data) are used. Moreover, qualitative data must be converted into numerical values to be used in the model. In the case of enterprise risk assessment the considered risks are, for instance, strategic, operational, legal and reputational, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. Bayesian networks are a useful tool for integrating different information and in particular for studying the joint distribution of risks by using data collected from experts. In this paper we show a possible approach for building a Bayesian network in the particular case in which only prior probabilities of node states and marginal correlations between nodes are available, and the variables have only two states.

Year of Publication:

2006-07-25

Document Type:

text

Subjects:

Physics - Physics and Society

DDC:

310 Collections of general statistics *(computed)*

Content Provider:

My Lists:

My Tags:

Notes:

Title:

Bayesian feature selection to estimate customer survival

Author:

Description:

We consider the problem of estimating the lifetime value of customers when a large number of features are present in the data. In order to measure lifetime value we use survival analysis models to estimate customer tenure. In such a context, a number of classical modelling challenges arise. We show how our proposed Bayesian methods perform, and compare them with classical churn models on a real case study. More specifically, based on data from a media service company, our aim is to predict churn behaviour in order to plan appropriate retention actions.

Publisher:

Università degli Studi di Pavia, Dipartimento di Economia Politica e Metodi Quantitativi (EPMQ) Pavia

Year of Publication:

2006

Document Type:

doc-type:workingPaper

Language:

eng

Subjects:

ddc:330 ; Statistische Methode ; Bayes-Statistik ; Verbraucher

DDC:

Rights:

http://www.econstor.eu/dspace/Nutzungsbedingungen

Relations:

Quaderni di Dipartimento, EPMQ, Università degli Studi di Pavia 185

Content Provider:

My Lists:

My Tags:

Notes:

Title:

On the Gini measure decomposition

Description:

The purpose of this research is to introduce a new approach to the decomposition of the Gini measure in terms of concordance and discordance shares: a new kind of dependence, the Gini rank dependence (GRD), and its formal definition are provided.

Keywords: Gini measure; concordance curve; Gini rank dependence
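For context, the classical quantity being decomposed (not the paper's GRD construction, which is not reproduced here) is the sample Gini measure: the mean absolute difference between observations divided by twice the mean.

```python
def gini(values):
    """Sample Gini measure of a list of non-negative values with positive mean:
    mean absolute pairwise difference divided by twice the mean."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(a - b) for a in values for b in values) / (n * n)
    return mad / (2 * mean)

print(gini([5, 5, 5]))  # perfect equality: 0.0
print(gini([0, 1]))     # maximal two-point inequality: 0.5
```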

Document Type:

article

URL:

Content Provider:

My Lists:

My Tags:

Notes:

Currently in BASE: 68,072,316 Documents of 3,307 Content Sources

http://www.base-search.net