Hit List

Title:

The performance of credit rating systems in the assessment of collateral used in Eurosystem monetary policy operations

Description:

The aims of this paper are twofold: first, we attempt to express the threshold of a single “A” rating as issued by major international rating agencies in terms of annualised probabilities of default. We use data from Standard & Poor’s and Moody’s publicly available rating histories to construct confidence intervals for the level of probability of default to be associated with the single “A” rating. The focus on the single “A” rating level is not accidental, as this is the credit quality level at which the Eurosystem considers financial assets to be eligible collateral for its monetary policy operations. The second aim is to review various existing validation models for the probability of default which enable the analyst to check the ability of credit assessment systems to forecast future default events. Within this context the paper proposes a simple mechanism for the comparison of the performance of major rating agencies and that of other credit assessment systems, such as the internal ratings-based systems of commercial banks under the Basel II regime. This is done to provide a simple validation yardstick to help in the monitoring of the performance of the different credit assessment systems participating in the assessment of eligible collateral underlying Eurosystem monetary policy operations. Contrary to the widely used confidence interval approach, our proposal, based on an interpretation of p-values as frequencies, guarantees a convergence to an ex ante fixed probability of default (PD) value. Given the general characteristics of the problem considered, we consider this simple mechanism to also be applicable in other contexts.

Document Type:

preprint

Title:

Managerial behavior and cost/profit efficiency in the banking sectors of Central and Eastern European countries

Description:

This paper analyzes cost and profit efficiency levels and the managerial behavior of banks in nine Central and Eastern European countries (the Czech Republic, Estonia, Hungary, Latvia, Lithuania, Poland, Romania, Slovakia and Slovenia), providing cross-country and time-series evidence for the period 1995-2002. A stochastic frontier analysis based on a Fourier-flexible form indicates a generally low level of cost efficiency and an even lower level of profit efficiency. However, we also find significant differences among countries and some evidence of an increasing tendency over time in profit efficiency and, to an even stronger extent, in cost efficiency. Cost and profit efficiency scores are negatively correlated on a country-wide as well as on a bank-by-bank basis. Furthermore, instead of just looking at the determinants of cost and profit efficiency (e.g. asset quality, problem loans and risk), we test several hypotheses of managerial behavior using the Granger causality approach based on the intertemporal relation between bank efficiency, capitalization and problem loans, as proposed by Berger and DeYoung (1997). Even though a static analysis shows a negative correlation between problem loans and efficiency, we find no evidence for the bad management hypothesis. Results provide evidence for the bad luck hypothesis, suggesting that exogenous bad loans trigger inefficiency. JEL classification: G21; G28; C14; D21. Keywords: cost and profit efficiency; CEECs; stochastic frontier analysis; managerial behavior
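
As background to the Granger-causality approach this abstract mentions, here is a minimal sketch of the mechanics only, not the authors' actual panel specification: regress the target series on its own lags with and without lags of the candidate cause, and compare the two fits with an F-statistic. The function name, lag order and toy data below are illustrative assumptions.

```python
import numpy as np

def granger_f_stat(y, x, p=2):
    """F-statistic for H0: lags of x do not help predict y.
    Restricted model:   y_t = c + sum_i a_i * y_{t-i} + e_t
    Unrestricted model: adds  sum_i b_i * x_{t-i}."""
    T = len(y)
    Y = y[p:]
    ylags = np.column_stack([y[p - i:T - i] for i in range(1, p + 1)])
    xlags = np.column_stack([x[p - i:T - i] for i in range(1, p + 1)])
    const = np.ones((T - p, 1))
    Xr = np.hstack([const, ylags])          # restricted design
    Xu = np.hstack([const, ylags, xlags])   # unrestricted design
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df = T - p - Xu.shape[1]                # residual degrees of freedom
    return ((rss_r - rss_u) / p) / (rss_u / df)

# Toy illustration: x leads y by one period, so lagged x should matter.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.empty(200)
y[0] = 0.0
for t in range(1, 200):
    y[t] = 0.5 * x[t - 1] + 0.1 * rng.normal()
f = granger_f_stat(y, x, p=2)  # large F => reject "no Granger causality"
```

A large F relative to the F(p, df) critical value rejects the null that the candidate cause carries no predictive content, which is the sense of "causality" used in the Berger-DeYoung framework.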

Document Type:

preprint

Title:

The performance of credit rating systems in the assessment of collateral used in Eurosystem monetary policy operations

Description:

The aims of this paper are twofold: first, we attempt to express the threshold of a single “A” rating as issued by major international rating agencies in terms of annualised probabilities of default. We use data from Standard & Poor’s and Moody’s publicly available rating histories to construct confidence intervals for the level of probability of default to be associated with the single “A” rating. The focus on the single “A” rating level is not accidental, as this is the credit quality level at which the Eurosystem considers financial assets to be eligible collateral for its monetary policy operations. The second aim is to review various existing validation models for the probability of default which enable the analyst to check the ability of credit assessment systems to forecast future default events. Within this context the paper proposes a simple mechanism for the comparison of the performance of major rating agencies and that of other credit assessment systems, such as the internal ratings-based systems of commercial banks under the Basel II regime. This is done to provide a simple validation yardstick to help in the monitoring of the performance of the different credit assessment systems participating in the assessment of eligible collateral underlying Eurosystem monetary policy operations. Contrary to the widely used confidence interval approach, our proposal, based on an interpretation of p-values as frequencies, guarantees a convergence to an ex ante fixed probability of default (PD) value. Given the general characteristics of the problem considered, we consider this simple mechanism to also be applicable in other contexts. Keywords: credit risk; rating; probability of default (PD); performance checking; backtesting
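
The paper's own backtesting mechanism is not reproduced in this record; as a hedged illustration of the general idea, a standard one-sided binomial test compares the defaults observed in a rating class against a benchmark PD. The pool size, default count and 0.10% benchmark below are made-up numbers, not the paper's calibrated single-“A” threshold.

```python
def binomial_tail(defaults: int, n: int, pd_benchmark: float) -> float:
    """One-sided p-value: probability of observing at least `defaults`
    defaults among n obligors if the true default probability equals
    pd_benchmark.  Uses the pmf recurrence to stay numerically stable."""
    q = 1.0 - pd_benchmark
    pmf = q ** n          # P(X = 0)
    cdf = 0.0             # accumulates P(X <= defaults - 1)
    for k in range(defaults):
        cdf += pmf
        pmf *= (n - k) / (k + 1) * (pd_benchmark / q)
    return 1.0 - cdf

# Illustrative numbers only: a pool of 5,000 obligors, an assumed benchmark
# PD of 0.10% (so 5 expected defaults), and 9 observed defaults.
p = binomial_tail(defaults=9, n=5000, pd_benchmark=0.001)
# A small p-value flags more defaults than the benchmark PD plausibly allows;
# a sequence of such p-values over the years is the kind of frequency
# information a monitoring mechanism can track.
```

Reading p-values as frequencies, as the abstract describes, means checking over repeated years that rejections occur no more often than the chosen significance level would predict.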

Document Type:

preprint

Title:

The performance of credit rating systems in the assessment of collateral used in Eurosystem monetary policy operations (ECB Occasional Paper Series, No. 65)

Description:

In 2007 all ECB publications feature a motif taken from the €20 banknote. OCCASIONAL PAPER SERIES

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2009-04-07

Source:

http://www.ecb.int/pub/pdf/scpops/ecbocp65.pdf

Document Type:

text

Language:

en

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

Title:

Complexity L0-Penalized M-Estimation: Consistency in More Dimensions

Description:

We study the asymptotics in L2 for complexity penalized least squares regression for the discrete approximation of finite-dimensional signals on continuous domains—e.g., images—by piecewise smooth functions. We introduce a fairly general setting, which comprises most of the presently popular partitions of signal or image domains, like interval, wedgelet or related partitions, as well as Delaunay triangulations. Then, we prove consistency and derive convergence rates. Finally, we illustrate by way of relevant examples that the abstract results are useful for many applications.
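
The abstract concerns complexity-penalized least squares in several dimensions; a one-dimensional analogue, the Potts functional (squared error plus a penalty per jump of a piecewise-constant fit), can be minimized exactly by a classical O(n²) dynamic program and gives a concrete feel for this family of estimators. The sketch below is illustrative and not taken from the paper.

```python
import numpy as np

def potts_1d(y, gamma):
    """Exact minimizer of sum_i (y_i - u_i)^2 + gamma * (#jumps of u)
    over piecewise-constant u, via partition dynamic programming."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Prefix sums give the squared error of any constant segment in O(1).
    s1 = np.concatenate([[0.0], np.cumsum(y)])
    s2 = np.concatenate([[0.0], np.cumsum(y * y)])

    def seg_cost(i, j):  # best squared error on y[i:j], fitted by its mean
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    B = np.full(n + 1, np.inf)   # B[j] = optimal value for the prefix y[:j]
    B[0] = -gamma                # cancels the penalty charged to the first segment
    last = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):       # i = start of the last segment
            c = B[i] + gamma + seg_cost(i, j)
            if c < B[j]:
                B[j], last[j] = c, i
    # Backtrack the segment boundaries; fill each segment with its mean.
    u = np.empty(n)
    j = n
    while j > 0:
        i = last[j]
        u[i:j] = (s1[j] - s1[i]) / (j - i)
        j = i
    return u

# Noisy step signal: a moderate gamma recovers the two levels and one jump.
rng = np.random.default_rng(1)
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.normal(size=100)
u = potts_1d(y, gamma=1.0)
```

The penalty weight gamma plays the role of the complexity penalty in the paper: too small and the fit chases noise with many segments, too large and genuine jumps are smoothed away.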

Publisher:

Multidisciplinary Digital Publishing Institute

Year of Publication:

2013-07-09

Source:

Axioms; Volume 2; Issue 3; Pages 311-344

Document Type:

Text

Language:

EN

Subjects:

adaptive estimation; penalized M-estimation; Potts functional; complexity penalized; variational approach; consistency; convergence rates; wedgelet partitions; Delaunay triangulations

Rights:

http://creativecommons.org/licenses/by/3.0/

Title:

An Adaptive Gradient Algorithm for Maximum Likelihood Estimation in Imaging: A Tutorial

Description:

Markov random fields serve as natural models for patterns or textures with random fluctuations at small scale. Given a general form of such fields, each class of pattern corresponds to a collection of model parameters which critically determines the ability of algorithms to segment or classify. Statistical inference on parameters is based on (dependent) data given by a portion of patterns inside some observation window. Unfortunately, the corresponding maximum likelihood estimators are computationally intractable by classical methods. Until recently, they were even regarded as entirely intractable. In recent years stochastic gradient algorithms for their computation were proposed and studied. An attractive class of such algorithms are those derived from adaptive algorithms, well known in engineering for a long time. We derive convergence theorems following closely the lines proposed by M. Métivier and P. Priouret (1987). This allows a transparent (albeit somewhat technical) treatment.
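
To make the stochastic gradient idea concrete: for an exponential-family Gibbs field p_θ(x) ∝ exp(θ·T(x)), the log-likelihood gradient is T(x_obs) − E_θ[T(X)], and the algorithm replaces the intractable expectation with the statistic of a state produced by a computationally feasible Gibbs sweep. The toy 1-D Ising chain, step sizes and projection bound below are illustrative choices, not the tutorial's exact scheme.

```python
import numpy as np

def gibbs_sweep(x, theta, rng):
    """One Gibbs sweep for a 1-D Ising chain p(x) ∝ exp(theta * sum_i x_i x_{i+1}),
    with x_i in {-1, +1}."""
    n = len(x)
    for i in range(n):
        # Local field from the chain neighbours of site i.
        h = theta * ((x[i - 1] if i > 0 else 0) + (x[i + 1] if i < n - 1 else 0))
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * h))
        x[i] = 1 if rng.random() < p_plus else -1
    return x

def T(x):  # sufficient statistic: agreeing minus disagreeing neighbour pairs
    return float(np.sum(x[:-1] * x[1:]))

def stochastic_gradient_mle(x_obs, steps=1000, seed=0):
    """theta_{k+1} = theta_k + a_k * (T(x_obs) - T(x_sim)): the simulated
    statistic stands in for the intractable expectation E_theta[T(X)]."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    x_sim = rng.choice([-1, 1], size=len(x_obs))
    for k in range(1, steps + 1):
        x_sim = gibbs_sweep(x_sim, theta, rng)          # one MCMC transition
        theta += (1.0 / k) * (T(x_obs) - T(x_sim)) / len(x_obs)
        theta = float(np.clip(theta, -2.0, 2.0))        # projection keeps iterates bounded
    return theta

# Data simulated at coupling 0.5 by a long Gibbs run; the estimate should
# settle at a clearly positive coupling.
rng = np.random.default_rng(42)
x_obs = rng.choice([-1, 1], size=200)
for _ in range(500):
    x_obs = gibbs_sweep(x_obs, 0.5, rng)
theta_hat = stochastic_gradient_mle(x_obs, steps=1000)
```

The decreasing Robbins-Monro step sizes a_k = 1/k and the projection onto a compact set are the standard devices that the convergence theorems in this line of work (following Métivier and Priouret) make rigorous.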

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2009-04-12

Source:

ftp://ftp.stat.uni-muenchen.de/pub/sfb386/paper120.ps.Z

Document Type:

text

Language:

en

Subjects:

adaptive algorithm ; stochastic approximation ; stochastic gradient descent ; MCMC methods ; maximum likelihood ; Gibbs fields ; imaging

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

Title:

Moment Sets of Bell-Shaped Distributions: Extreme Points, Extremal Decomposition and Chebysheff Inequalities

Description:

The paper deals with sets of distributions which are given by moment conditions for the distributions and convex constraints on derivatives of their c.d.f.s. A general albeit simple method for the study of their extremal structure, extremal decomposition and topological or measure-theoretical properties is developed. Its power is demonstrated by the application to bell-shaped distributions. Extreme points of their moment sets are characterized completely (thus filling a gap in the previous theory) and inequalities of Chebysheff type are derived by means of general integral representation theorems. Keywords: moment sets; Chebysheff inequalities; extremal bell-shaped distributions
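
For orientation only, the classical prototype of the "Chebysheff-type" inequalities the paper generalizes bounds tail probabilities by the variance; the sharper bounds the paper derives for bell-shaped distributions via extreme points are not reproduced in this record.

```latex
% Classical Chebyshev inequality: for any distribution with mean \mu,
% variance \sigma^2, and any t > 0,
P\bigl(|X - \mu| \ge t\bigr) \;\le\; \frac{\sigma^2}{t^2}.
```

Shape constraints such as bell-shapedness shrink the feasible set of distributions, which is why sharper bounds of this type become available.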

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2009-04-13

Source:

ftp://ftp.stat.uni-muenchen.de/pub/sfb386/paper121.ps.Z

ftp://ftp.stat.uni-muenchen.de/pub/sfb386/paper121.ps.Z Minimize

Document Type:

text

Language:

en

DDC:

515 Analysis *(computed)*

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

Title:

A Stochastic Algorithm For Maximum Likelihood Estimation In Imaging

Description:

Random fields serve as natural models for patterns with random fluctuations. Given a parametric family of such fields, each type of pattern corresponds to a specific vector of model parameters which critically determine the ability of algorithms to segment or classify. Inference on parameters is based on dependent data in some observation window. Maximum likelihood estimators presently are not tractable by classical numerical methods. In recent years, stochastic gradient algorithms based on computationally feasible Markov chains were proposed. We derive approximation theorems following the lines of M. Métivier and P. Priouret (1987). This allows a fairly transparent treatment. We keep track of the relation between paths of the Markov chain algorithm and solutions of the underlying gradient system.

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2009-04-14

Source:

http://www.gsf.de/ibb/preprints/1998/pp98-07.ps

Document Type:

text

Language:

en

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

Title:

A STOCHASTIC ALGORITHM FOR MAXIMUM LIKELIHOOD ESTIMATION IN IMAGING

Description:

Random fields serve as natural models for patterns with random fluctuations. Given a parametric family of such fields, each type of pattern corresponds to a specific vector of model parameters which critically determine the ability of algorithms to segment or classify. Inference on parameters is based on dependent data in some observation window. Maximum likelihood estimators presently are not tractable by classical numerical methods. In recent years, stochastic gradient algorithms based on computationally feasible Markov chains were proposed. We derive approximation theorems following the lines of M. Métivier and P. Priouret (1987). This allows a fairly transparent treatment. We keep track of the relation between paths of the Markov chain algorithm and solutions of the underlying gradient system.

Contributors:

The Pennsylvania State University CiteSeerX Archives

Year of Publication:

2008-07-01

Source:

http://ibb.gsf.de/preprints/1998/pp98-07.ps

Document Type:

text

Language:

en

Rights:

Metadata may be used without restrictions as long as the oai identifier remains attached to it.

Title:

Complexity L0-Penalized M-Estimation: Consistency in More Dimensions

Description:

We study the asymptotics in L2 for complexity penalized least squares regression for the discrete approximation of finite-dimensional signals on continuous domains—e.g., images—by piecewise smooth functions. We introduce a fairly general setting, which comprises most of the presently popular partitions of signal or image domains, like interval, wedgelet or related partitions, as well as Delaunay triangulations. Then, we prove consistency and derive convergence rates. Finally, we illustrate by way of relevant examples that the abstract results are useful for many applications.

Publisher:

Multidisciplinary Digital Publishing Institute

Year of Publication:

2013-07-01T00:00:00Z

Document Type:

article

Language:

English

Subjects:

adaptive estimation; penalized M-estimation; Potts functional; complexity penalized; variational approach; consistency; convergence rates; wedgelet partitions; Delaunay triangulations; LCC:Mathematics; LCC:QA1-939; LCC:Science; LCC:Q; DOAJ:Mathematics; DOAJ:Mathematics and Statistics

Rights:

CC BY

Relations:

http://www.mdpi.com/2075-1680/2/3/311

Currently in BASE: 69,426,436 documents from 3,331 content sources

http://www.base-search.net