Abstract:
In this paper, we elaborate on a new design approach to the development and analysis of granular input spaces and the ensuing granular modeling. Given a numeric model (no matter what design methodology has been used to construct it and what architecture has been adopted), we form a granular input space by allocating a certain level of information granularity across the input variables. The formation of the granular input space helps us gain better insight into the ranking of the input variables with respect to the precision they require (the variables assigned a lower level of information granularity need to be specified precisely when estimating the inputs). As a consequence, for granular inputs, the outputs of the granular model are also information granules (e.g., intervals, fuzzy sets, or rough sets). It is shown that the process of forming the granular input space can be cast as an optimization of the allocation of information granularity across the input variables, carried out so that both the specificity of the corresponding granular outputs of the model and the coverage of the experimental data are maximized. The construction of the granular input space rests on two fundamental principles of granular computing: the principle of justifiable granularity and the optimal allocation of information granularity. The quality of the granular input space is quantified in terms of two conflicting criteria, namely the specificity of the results produced by the granular model and the coverage of the experimental data delivered by this model. In the ensuing optimization problem, one maximizes the product of specificity and coverage. Differential evolution is employed to carry out this optimization. The experimental studies involve both a synthetic dataset and data coming from the machine learning repository.
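To make the optimization criterion concrete, the following is a minimal sketch, in Python, of the coverage-times-specificity fitness and its maximization with differential evolution. The numeric model f, the overall granularity level EPS, the even per-variable budget, and the propagation of interval inputs by sampling are all illustrative assumptions introduced here for the sketch; they are not the authors' exact formulation.

    # Hypothetical sketch: allocate information granularity across inputs so that
    # the product of coverage and specificity of the granular outputs is maximized.
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(0)

    # Placeholder numeric model and data (assumptions for illustration only).
    def f(x):
        return x[..., 0] * np.sin(x[..., 1]) + 0.5 * x[..., 2]

    X = rng.uniform(-1.0, 1.0, size=(100, 3))
    y = f(X)

    EPS = 0.10                      # overall level of information granularity to distribute
    RANGE_Y = y.max() - y.min()     # used to normalize interval widths for specificity

    def granular_output(x, eps, n_samples=32):
        # Propagate the interval input [x - eps, x + eps] through f by sampling
        # and return the resulting output interval (min, max).
        samples = rng.uniform(x - eps, x + eps, size=(n_samples, x.size))
        out = f(samples)
        return out.min(), out.max()

    def fitness(alloc):
        # Normalize the allocation so the total granularity budget is preserved:
        # sum(eps_i) = n * EPS.
        alloc = np.asarray(alloc)
        eps = EPS * X.shape[1] * alloc / alloc.sum()
        covered, spec = 0, 0.0
        for x, target in zip(X, y):
            lo, hi = granular_output(x, eps)
            covered += (lo <= target <= hi)                    # coverage of the data point
            spec += max(0.0, 1.0 - (hi - lo) / RANGE_Y)        # narrower interval -> higher specificity
        coverage = covered / len(y)
        specificity = spec / len(y)
        return -(coverage * specificity)                       # DE minimizes, so negate the product

    result = differential_evolution(fitness, bounds=[(1e-3, 1.0)] * X.shape[1],
                                    maxiter=20, seed=0, polish=False)
    print("granularity allocation:", EPS * X.shape[1] * result.x / result.x.sum())
    print("coverage * specificity:", -result.fun)

In this sketch, variables that receive a small eps_i must be supplied precisely, while those with a large eps_i tolerate imprecise inputs, which is the ranking of input variables referred to above.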