In the mathematical theory of decisions, decision-theoretic rough sets (DTRS) is a probabilistic extension of rough set classification.
First created in 1990 by Dr. Yiyu Yao,[1] the extension makes use of loss functions to derive the $\alpha$ and $\beta$ region parameters.
As in rough set theory, the lower and upper approximations of a set are used.
The following contains the basic principles of decision-theoretic rough sets.
Using the Bayesian decision procedure, the decision-theoretic rough set (DTRS) approach allows for minimum-risk decision making based on observed evidence.
Let $\mathcal{A} = \{a_1, \ldots, a_m\}$ be a finite set of $m$ possible actions and let $\Omega = \{w_1, \ldots, w_s\}$ be a finite set of $s$ states. $P(w_j \mid x)$ is calculated as the conditional probability of an object $x$ being in state $w_j$ given the object description $x$. $\lambda(a_i \mid w_j)$ denotes the loss, or cost, for performing action $a_i$ when the state is $w_j$. The expected loss (conditional risk) associated with taking action $a_i$ is given by:

$R(a_i \mid x) = \sum_{j=1}^{s} \lambda(a_i \mid w_j)\, P(w_j \mid x).$
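As a concrete illustration of the conditional risk formula, the following Python sketch computes $R(a_i \mid x)$ for a single action. The number of states, the loss values, and the posterior probabilities are illustrative assumptions, not values from the literature.

```python
# Minimal sketch of the conditional risk R(a_i | x) = sum_j lambda(a_i | w_j) * P(w_j | x).
# State count, losses, and posteriors below are assumed for illustration only.

def conditional_risk(losses, posteriors):
    """Expected loss of one action: losses[j] = lambda(a_i | w_j), posteriors[j] = P(w_j | x)."""
    return sum(loss * p for loss, p in zip(losses, posteriors))

# Two hypothetical states (w_1, w_2) and one candidate action.
posteriors = [0.7, 0.3]          # P(w_1 | x), P(w_2 | x)
losses_for_action = [0.0, 4.0]   # lambda(a | w_1), lambda(a | w_2)

print(conditional_risk(losses_for_action, posteriors))  # 0.0*0.7 + 4.0*0.3 = 1.2
```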
Object classification with the approximation operators can be fitted into the Bayesian decision framework. The set of actions is given by $\mathcal{A} = \{a_P, a_N, a_B\}$, where $a_P$, $a_N$, and $a_B$ represent the three actions in classifying an object into POS($A$), NEG($A$), and BND($A$), respectively. To indicate whether an element is in $A$ or not in $A$, the set of states is given by $\Omega = \{A, A^c\}$. Let $\lambda(a_\diamond \mid A)$ denote the loss incurred by taking action $a_\diamond$ when an object belongs to $A$, and let $\lambda(a_\diamond \mid A^c)$ denote the loss incurred by taking the same action when the object belongs to $A^c$.

Let $\lambda_{PP}$ denote the loss function for classifying an object in $A$ into the POS region, $\lambda_{BP}$ denote the loss function for classifying an object in $A$ into the BND region, and $\lambda_{NP}$ denote the loss function for classifying an object in $A$ into the NEG region. Similarly, $\lambda_{PN}$, $\lambda_{BN}$, and $\lambda_{NN}$ denote the loss of classifying an object that does not belong to $A$ into the POS, BND, and NEG regions, respectively.

Taking each individual action can be associated with an expected loss:

$R(a_P \mid [x]) = \lambda_{PP} P(A \mid [x]) + \lambda_{PN} P(A^c \mid [x]),$
$R(a_N \mid [x]) = \lambda_{NP} P(A \mid [x]) + \lambda_{NN} P(A^c \mid [x]),$
$R(a_B \mid [x]) = \lambda_{BP} P(A \mid [x]) + \lambda_{BN} P(A^c \mid [x]),$

where $\lambda_{\diamond P} = \lambda(a_\diamond \mid A)$, $\lambda_{\diamond N} = \lambda(a_\diamond \mid A^c)$, and $\diamond = P$, $N$, or $B$; here $[x]$ denotes the equivalence class containing the object $x$.
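The minimum-risk choice among the three actions can be sketched as follows; the six loss values are assumed for illustration only and satisfy the ordering conditions introduced below.

```python
# Sketch of the three expected losses R(a_P | [x]), R(a_N | [x]), R(a_B | [x]).
# The lambda values are illustrative assumptions, chosen so that
# lambda_PP <= lambda_BP < lambda_NP and lambda_NN <= lambda_BN < lambda_PN.

lam = {                                  # lam[(action, membership)]
    ("P", "A"): 0.0, ("P", "Ac"): 4.0,   # lambda_PP, lambda_PN
    ("B", "A"): 1.0, ("B", "Ac"): 1.0,   # lambda_BP, lambda_BN
    ("N", "A"): 5.0, ("N", "Ac"): 0.0,   # lambda_NP, lambda_NN
}

def expected_losses(p_a):
    """Return {action: R(a | [x])} given P(A | [x]) = p_a and P(A^c | [x]) = 1 - p_a."""
    return {act: lam[(act, "A")] * p_a + lam[(act, "Ac")] * (1.0 - p_a)
            for act in ("P", "B", "N")}

risks = expected_losses(p_a=0.6)
print(risks)                          # {'P': 1.6, 'B': 1.0, 'N': 3.0}
print(min(risks, key=risks.get))      # minimum-risk action, here the boundary action 'B'
```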
If the loss functions satisfy $\lambda_{PP} \leq \lambda_{BP} < \lambda_{NP}$ and $\lambda_{NN} \leq \lambda_{BN} < \lambda_{PN}$, the following decision rules are formulated (P, N, B):

P: If $P(A \mid [x]) \geq \gamma$ and $P(A \mid [x]) \geq \alpha$, decide POS($A$);
N: If $P(A \mid [x]) \leq \beta$ and $P(A \mid [x]) \leq \gamma$, decide NEG($A$);
B: If $\beta \leq P(A \mid [x]) \leq \alpha$, decide BND($A$);

where

$\alpha = \dfrac{\lambda_{PN} - \lambda_{BN}}{(\lambda_{PN} - \lambda_{BN}) + (\lambda_{BP} - \lambda_{PP})},$
$\gamma = \dfrac{\lambda_{PN} - \lambda_{NN}}{(\lambda_{PN} - \lambda_{NN}) + (\lambda_{NP} - \lambda_{PP})},$
$\beta = \dfrac{\lambda_{BN} - \lambda_{NN}}{(\lambda_{BN} - \lambda_{NN}) + (\lambda_{NP} - \lambda_{BP})}.$

The $\alpha$, $\beta$, and $\gamma$ values define the three different regions, giving us an associated risk for classifying an object. When $\alpha > \beta$, we get $\alpha > \gamma > \beta$, and the rules (P-B) simplify to (P1-B1), which divide the regions using only $\alpha$ and $\beta$:

P1: If $P(A \mid [x]) \geq \alpha$, decide POS($A$);
N1: If $P(A \mid [x]) \leq \beta$, decide NEG($A$);
B1: If $\beta < P(A \mid [x]) < \alpha$, decide BND($A$).

When $\alpha = \beta = \gamma$, we can simplify the rules (P-B) into (P2-B2), which divide the regions based solely on $\alpha$:

P2: If $P(A \mid [x]) > \alpha$, decide POS($A$);
N2: If $P(A \mid [x]) < \alpha$, decide NEG($A$);
B2: If $P(A \mid [x]) = \alpha$, decide BND($A$).
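A short sketch of the threshold computation follows, using the same assumed loss values as above; it derives $\alpha$, $\gamma$, and $\beta$ from the losses and applies the simplified (P1-B1) rules.

```python
# Sketch of the alpha/gamma/beta thresholds and the (P1-B1) region assignment.
# The loss values repeat the illustrative assumptions used earlier.

lam_PP, lam_BP, lam_NP = 0.0, 1.0, 5.0   # losses for objects in A
lam_PN, lam_BN, lam_NN = 4.0, 1.0, 0.0   # losses for objects in A^c

alpha = (lam_PN - lam_BN) / ((lam_PN - lam_BN) + (lam_BP - lam_PP))
gamma = (lam_PN - lam_NN) / ((lam_PN - lam_NN) + (lam_NP - lam_PP))
beta  = (lam_BN - lam_NN) / ((lam_BN - lam_NN) + (lam_NP - lam_BP))

def region(p_a):
    """Assign POS/BND/NEG from P(A | [x]) using alpha and beta (assumes alpha > beta)."""
    if p_a >= alpha:
        return "POS"
    if p_a <= beta:
        return "NEG"
    return "BND"

print(alpha, gamma, beta)                       # 0.75, 0.444..., 0.2 for these losses
print(region(0.8), region(0.5), region(0.1))    # POS BND NEG
```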
Data mining, feature selection, information retrieval, and classification are just some of the applications in which the DTRS approach has been successfully used.