Scientific Journals and Yearbooks Published at SAS

Article List

Computing and Informatics

Volume 26, 2007, No. 1


  An Algorithm for the Generation of Segmented Parametric Software Estimation Models and Its Empirical Evaluation

Parametric software estimation, software project databases, clustering algorithms, EM algorithm

Parametric software effort estimation techniques use mathematical cost-estimation relationships derived from historical project databases, usually obtained through standard curve regression techniques. However, project databases, especially consortium-created compilations like the ISBSG, collect highly heterogeneous data, coming from projects that diverge in size, process and personnel skills, among other factors. As a result, a single parametric model is seldom able to capture the diversity of the sources, which in turn results in poor overall quality. Segmented parametric estimation models use local regression to derive one model per segment of data with similar characteristics, improving the overall predictive quality of parametric estimation. Further, the process of obtaining segmented models can be expressed as a generic algorithm that can be used to produce candidate models in an automated process of calibration from the project database at hand. This paper describes the rationale for such an algorithmic scheme, along with the empirical evaluation of a concrete version that uses the EM clustering algorithm combined with the common parametric exponential size-effort model and standard quality-of-adjustment criteria. Results point to the adequacy of the technique as an extension of existing single-relation models.

Computing and Informatics. Volume 26, 2007, No. 1: 1-15.
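
As a rough illustration of the idea, the sketch below (Python, with invented toy data) fits the common exponential size-effort model effort = a * size^b separately to each segment of a project database. In the paper the segments would be produced by the EM clustering algorithm; here they are simply given, so only the per-segment local regression step is shown.

```python
import math

def fit_power_model(points):
    """Fit effort = a * size**b by least squares in log-log space."""
    xs = [math.log(s) for s, _ in points]
    ys = [math.log(e) for _, e in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Two hypothetical project segments (size in KLOC, effort in person-months);
# in the paper these segments would come from EM clustering of the database.
segments = {
    "small": [(2, 5), (4, 9), (8, 17), (16, 33)],
    "large": [(50, 200), (100, 500), (200, 1300)],
}
models = {name: fit_power_model(pts) for name, pts in segments.items()}
for name, (a, b) in models.items():
    print(f"{name}: effort = {a:.2f} * size^{b:.2f}")
```

Each segment thus gets its own (a, b) pair instead of forcing one global relationship onto heterogeneous data.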

  Image Segmentation by Fuzzy C-Means Clustering Algorithm with a Novel Penalty Term
Yong YANG, Shuying HUANG

Fuzzy c-means, clustering, image segmentation, expectation maximization

To overcome the noise sensitivity of the conventional fuzzy c-means (FCM) clustering algorithm, a novel extended FCM algorithm for image segmentation is presented in this paper. The algorithm is developed by modifying the objective function of the standard FCM algorithm with a penalty term that takes into account the influence of the neighboring pixels on the centre pixels. The penalty term acts as a regularizer in this algorithm; it is inspired by the neighborhood expectation maximization algorithm and is modified to satisfy the criterion of the FCM algorithm. The performance of our algorithm is discussed and compared with those of many derivatives of the FCM algorithm. Experimental results on the segmentation of synthetic and real images demonstrate that the proposed algorithm is effective and robust.

Computing and Informatics. Volume 26, 2007, No. 1: 17-31.
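
The baseline iteration that the paper extends can be sketched as plain fuzzy c-means on one-dimensional toy data. The penalty term itself is omitted here; in the paper it would add a neighbor-coupling term to the objective and hence to the update equations below.

```python
import random

def fcm(data, centers, m=2.0, iters=30):
    """Standard fuzzy c-means; the paper's extension adds a neighborhood
    penalty term to the objective, which modifies these updates."""
    c = len(centers)
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + 1e-9 for v in centers]
            u.append([1.0 / sum((d[i] / d[k]) ** (2.0 / (m - 1))
                                for k in range(c))
                      for i in range(c)])
        # Center update: v_i = sum_j u_ij^m x_j / sum_j u_ij^m
        centers = [sum(u[j][i] ** m * x for j, x in enumerate(data))
                   / sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(c)]
    return centers, u

random.seed(0)
# Two well-separated intensity clusters standing in for image regions.
data = ([random.gauss(0.0, 0.3) for _ in range(30)]
        + [random.gauss(5.0, 0.3) for _ in range(30)])
centers, u = fcm(data, centers=[min(data), max(data)])
print(sorted(round(v, 2) for v in centers))
```

On pixel data, the noise sensitivity the paper addresses shows up because each membership u_ij above depends only on that pixel's own distance to the centers, with no smoothing from its neighbors.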

  Evolving Generalized Euclidean Distances for Training RBNN

Generalized distances, evolving distances, radial basis neural networks, genetic algorithms

In Radial Basis Neural Networks (RBNN), the activation of each neuron depends on the Euclidean distance between a pattern and the neuron center. Such a symmetrical activation assumes that all attributes are equally relevant, which might not be true. Non-symmetrical distances such as the Mahalanobis distance can be used instead. However, the Mahalanobis distance is computed directly from the data covariance matrix, and therefore the accuracy of the learning algorithm is not taken into account. In this paper, we propose to use a Genetic Algorithm to search for a generalized Euclidean distance matrix that minimizes the error produced by an RBNN.

Computing and Informatics. Volume 26, 2007, No. 1: 33-43.
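
A minimal sketch of the idea, with a nearest-centre classifier standing in for a full RBNN (an assumption purely for brevity): a genetic algorithm evolves per-attribute weights of a diagonal generalized Euclidean distance so as to minimize the error on toy data in which one attribute is pure noise. The data, centres and GA parameters below are all invented.

```python
import random

def wdist(x, c, w):
    """Squared generalized Euclidean distance with a diagonal weight matrix."""
    return sum(wk * (xk - ck) ** 2 for wk, xk, ck in zip(w, x, c))

def accuracy(w, data, centers):
    """Fraction of points assigned to the centre of their own class."""
    ok = 0
    for x, label in data:
        pred = min(range(len(centers)), key=lambda i: wdist(x, centers[i], w))
        ok += (pred == label)
    return ok / len(data)

random.seed(1)
# Toy data: attribute 0 separates the classes, attribute 1 is noise.
data = [((random.gauss(2.0 * y, 0.4), random.gauss(0.0, 3.0)), y)
        for y in (0, 1) for _ in range(50)]
by_class = [[x for x, y in data if y == c] for c in (0, 1)]
centers = [tuple(sum(p[k] for p in pts) / len(pts) for k in range(2))
           for pts in by_class]

# Tiny GA over weight vectors; [1, 1] (plain Euclidean distance) is seeded
# into the population, and truncation selection keeps the best individuals.
pop = [[1.0, 1.0]] + [[random.random(), random.random()] for _ in range(19)]
for _ in range(30):
    pop.sort(key=lambda w: accuracy(w, data, centers), reverse=True)
    pop = pop[:10] + [[max(0.0, g + random.gauss(0, 0.1)) for g in p]
                      for p in random.choices(pop[:10], k=10)]
best = max(pop, key=lambda w: accuracy(w, data, centers))
print(best, accuracy(best, data, centers))
```

Because selection optimizes the classifier's error directly, the evolved weights can suppress the irrelevant attribute, which is exactly what a covariance-derived Mahalanobis distance would not be driven to do.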

  Multilayer Perceptrons and Data Compression
Robert MANGER, Krunoslav PULJIC

Artificial neural networks, data compression, multilayer perceptrons, holographic neural networks, experiments

This paper investigates the feasibility of using artificial neural networks as a tool for data compression. More precisely, the paper measures the compression capabilities of standard multilayer perceptrons. An outline of a possible "neural" data compression method is given. The method is based on training a perceptron to reproduce a given data file. Experiments are presented in which the outlined method has been simulated using differently configured perceptrons and various data files. The best compression rates obtained in the experiments are listed and compared with similar results obtained with holographic neural networks in a previous paper.

Computing and Informatics. Volume 26, 2007, No. 1: 45-62.
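
One way to see where the compression comes from: once a perceptron reproduces the file, the stored network parameters are the compressed representation, so the achievable rate is bounded by the ratio of the file size to the size of those parameters. The sketch below (with a hypothetical topology and 4-byte weights, not the paper's actual configurations) counts the parameters of a fully connected perceptron and derives that bound.

```python
def mlp_param_count(layers):
    """Weights plus biases of a fully connected multilayer perceptron
    with the given layer sizes, e.g. [8, 16, 8]."""
    return sum(layers[i] * layers[i + 1] + layers[i + 1]
               for i in range(len(layers) - 1))

def compression_rate(file_bytes, layers, bytes_per_weight=4):
    """Original file size divided by the size of the stored parameters."""
    return file_bytes / (mlp_param_count(layers) * bytes_per_weight)

# A hypothetical 64 KB file reproduced by an 8-16-8 perceptron:
print(mlp_param_count([8, 16, 8]))                 # 280 parameters
print(compression_rate(64 * 1024, [8, 16, 8]))
```

The bound ignores whatever bookkeeping a concrete format would add, and of course says nothing about whether training actually converges to a faithful reproduction of the file.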

  Computed Answer from Uncertain Knowledge: A Model for Handling Uncertain Information
Agnes ACHS

Uncertainty modelling, knowledge-based systems, fuzzy sets

In this work we present a model for handling uncertain information. The concept of a fuzzy knowledge-base is defined as a quadruple consisting of background knowledge, defined by the proximity of predicates and terms; a deduction mechanism, namely a fuzzy Datalog program; a connecting algorithm, which connects the background knowledge with the program; and a decoding set of the program, which helps us determine the uncertainty level of the results. We also suggest a possible evaluation strategy.

Computing and Informatics. Volume 26, 2007, No. 1: 63-76.
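
A toy illustration of fuzzy Datalog-style deduction (not the paper's exact model): facts carry uncertainty levels, a rule has its own confidence, and the level of a derived atom is obtained with the min operator. The predicates, constants and levels below are invented.

```python
# Facts: ground atoms mapped to uncertainty levels in [0, 1].
facts = {
    ("parent", "ann", "bob"): 0.9,
    ("parent", "bob", "cid"): 0.8,
}

# Rule: grandparent(X, Z) <- parent(X, Y), parent(Y, Z), with confidence 0.95.
RULE_CONF = 0.95

def derive(facts):
    """One bottom-up deduction step: the derived atom's level is the min of
    the body atoms' levels and the rule confidence."""
    derived = dict(facts)
    for (p1, x, y), u1 in facts.items():
        for (p2, y2, z), u2 in facts.items():
            if p1 == p2 == "parent" and y == y2:
                level = min(u1, u2, RULE_CONF)
                key = ("grandparent", x, z)
                derived[key] = max(derived.get(key, 0.0), level)
    return derived

result = derive(facts)
print(result[("grandparent", "ann", "cid")])  # → 0.8
```

The paper's decoding set plays the role that the literal levels play here, and its connecting algorithm would additionally let near-synonymous predicates and terms (via the proximity relation) participate in the deduction.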

  On High-Rate Cryptographic Compression Functions

Hash functions, compression functions, block ciphers, provable security

The security of iterated hash functions relies on the properties of underlying compression functions. We study highly efficient compression functions based on block ciphers. We propose a model for high-rate compression functions, and give an upper bound for the rate of any collision resistant compression function in our model. In addition, we show that natural generalizations of constructions by Preneel, Govaerts, and Vandewalle to the case of rate-2 compression functions are not collision resistant.

Computing and Informatics. Volume 26, 2007, No. 1: 77-87.
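
The rate of a block-cipher-based compression function counts the message blocks processed per cipher call. As a structural sketch only, the following shows the classic rate-1 Davies-Meyer shape h' = E_m(h) XOR h with a toy, insecure stand-in cipher; a rate-2 construction of the kind the paper studies (generalizing the Preneel-Govaerts-Vandewalle schemes) would consume two message blocks per call.

```python
MASK64 = 2 ** 64 - 1

def toy_block_cipher(key, block, rounds=4):
    """Stand-in 64-bit 'block cipher' (NOT secure), used only to show the
    compression-function structure."""
    x = block & MASK64
    for r in range(rounds):
        x = ((x * 0x9E3779B97F4A7C15) ^ key ^ r) & MASK64
        x = ((x << 13) | (x >> 51)) & MASK64   # 64-bit rotate left by 13
    return x

def davies_meyer(h, m):
    """Rate-1: one message block per cipher call, h' = E_m(h) XOR h."""
    return toy_block_cipher(m, h) ^ h

h = 0x0123456789ABCDEF                    # initial chaining value
for m in [0x1111, 0x2222, 0x3333]:        # three blocks -> three cipher calls
    h = davies_meyer(h, m)
print(hex(h))
```

Higher rate means fewer cipher calls per message block and hence faster hashing, which is precisely why the paper's upper bound on the rate of collision-resistant constructions matters.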

  Overhead Verification for Cryptographically Secured Transmission on the Grid
Wojciech RZĄSA, Marian BUBAK, Bartosz BALIŚ, Tomasz SZEPIENIEC

Grid, security, GSI, application monitoring, OCM-G

It is well known that the network protocols frequently used in the Internet and in Local Area Networks do not ensure the security level required by current distributed applications. This is even more crucial in the Grid environment. Therefore, asymmetric cryptography algorithms have been applied in order to secure information transmitted over the network. The security level enforced by means of these algorithms is found to be sufficient; however, they introduce additional transmission overhead. In this paper we describe experiments performed in order to evaluate transmission efficiency depending on the security level applied.

Computing and Informatics. Volume 26, 2007, No. 1: 2007-101.
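
A measurement harness in the spirit of such experiments can be sketched as follows. A byte-wise XOR stands in for the real cryptographic transform (purely an assumption for illustration; the actual experiments involve GSI-secured transmission over a network), and the ratio of wall-clock times gives the relative overhead of the secured path.

```python
import time

def measure(transfer, payloads, repeats=5):
    """Best wall-clock time for pushing all payloads through `transfer`."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        for p in payloads:
            transfer(p)
        best = min(best, time.perf_counter() - t0)
    return best

# Stand-ins: a plain pass-through vs. a per-byte "encryption" step.
def plain(p):
    return p

KEY = 0x5A
def xor_encrypt(p):
    return bytes(b ^ KEY for b in p)

payloads = [bytes(range(256)) * 16 for _ in range(20)]   # 20 x 4 KB messages
t_plain = measure(plain, payloads)
t_enc = measure(xor_encrypt, payloads)
print(f"relative overhead: {t_enc / t_plain:.1f}x")
```

Taking the best of several repeats reduces scheduler noise; a real evaluation would additionally vary payload sizes and the security level of the channel, as the paper does.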