Katerina Goseva-Popstojanova, Ahmed Hassan, Ajith Guedem, Walid Abdelmoez, Diaa Eldin M. Nassar, Hany H. Ammar, and Ali Mili, “Architectural-Level Risk Analysis Using UML,” IEEE Transactions on Software Engineering, October 2003
Robert David Cowan, Ali Mili, Hany H. Ammar, Alan McKendall, Jr., Lin Yang, Dapeng Chen, and Terry Spencer, “Software Engineering Technology Watch,” IEEE Software, July 2002
Mark Shereshevsky, Habib Ammari, Nicholay Gradetsky, Ali Mili, and Hany H. Ammar, “Information Theoretic Metrics for Software Architectures,” International Computer Software and Applications Conference (COMPSAC 2001), IEEE Computer Society, Chicago, IL, October 2001
Sherif M. Yacoub and Hany H. Ammar, “UML Support for Designing Software Systems as a Composition of Design Patterns,” UML 2001, Lecture Notes in Computer Science, Springer-Verlag, Toronto, Ont., Canada, October 2001
Rania M. Elnaggar, Vittorio Cortellessa, Hany Ammar, “A UML-based Architectural Model for Timing and Performance Analyses of GSM Radio Subsystem,” 5th World Multi-Conference on Systems, Cybernetics and Informatics, July 2001. Received Best Paper Award
Ahmed Hassan, Walid M. Abdelmoez, Rania M. Elnaggar, Hany H. Ammar, “An Approach to Measure the Quality of Software Designs from UML Specifications,” 5th World Multi-Conference on Systems, Cybernetics and Informatics and the 7th International Conference on Information Systems, Analysis and Synthesis (ISAS), July 2001.
Hany H. Ammar, Vittorio Cortellessa, Alaa Ibrahim, “Modeling Resources in a UML-based Simulative Environment,” ACS/IEEE International Conference on Computer Systems and Applications (AICCSA'2001), Beirut, Lebanon, 26-29 June 2001
L. Lakshmi K. Bhetanabhotla, N. Bussa, A. Mohammad, O. Abdalla, and Hany H. Ammar, “UML Architecture of a Web-Based Interactive Course Tool,” The World Internet and Electronic Cities Conference 2001, May 1-3, 2001, Kish Island, Iran
Hany H. Ammar, Xuhui Zhen, Diaa Nassar, Yingzi Jin, “Web-Based Test Bed for Fingerprint Image Comparison,” The World Internet and Electronic Cities Conference 2001, May 1-3, 2001, Kish Island, Iran. Received Best Paper Award.
Alaa Ibrahim, Sherif M. Yacoub, Hany H. Ammar, “Architectural-Level Risk Analysis for UML Dynamic Specifications,” Proceedings of the 9th International Conference on Software Quality Management (SQM2001), Loughborough University, England, April 18-20, 2001, pp. 179-190
A UML Model for Analyzing Software Quality, in Proc. International Conference on Reliability and Quality in Design, ISSAT, Orlando, FL, August 2000, pp. 85-89
To measure the quality of a software product, we have to define quality in terms of attributes that we desire to measure. Those desirable attributes (external attributes) are usually different from what we are actually able to measure from the artifacts produced along the software development process (internal attributes). To understand the relationship between external and internal attributes and to develop a successful measurement process, we have to develop software quality measurement models that distinguish between what we can measure and what we desire to measure.
In this paper, we develop a model for software quality using the Unified Modeling Language (UML). In this model, we adopt a three-level hierarchy of relevant features, whereby we distinguish between qualitative attributes (that are externally observable, and most relevant from a user's viewpoint), quantitative factors (that have a precise mathematical formula, but may be difficult to evaluate), and computable metrics (that are internal, product-specific quantities that can be evaluated from the software artifacts).
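The three-level hierarchy could be organized, for instance, as a simple mapping from computable metrics up through quantitative factors to a qualitative attribute. The metric names, formulas, and weights below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the three-level hierarchy described above:
# computable metrics feed quantitative factors, which in turn support
# a qualitative attribute. All names and formulas are illustrative.

# Computable metrics: evaluated directly from software artifacts,
# here assumed to be normalized to [0, 1].
metrics = {"coupling": 0.3, "cyclomatic": 0.6, "fan_out": 0.4}

# Quantitative factors: precise formulas over the computable metrics.
factors = {
    "changeability": 1.0 - (metrics["coupling"] + metrics["fan_out"]) / 2,
    "testability":   1.0 - metrics["cyclomatic"],
}

# Qualitative attribute: externally observable; here mapped from the
# factors by a plain average as a stand-in for an empirical model.
maintainability = sum(factors.values()) / len(factors)
print(factors, round(maintainability, 3))
```

The point of the structure is that only the bottom level is computed from artifacts; the upper levels are derived, which keeps the distinction between what can be measured and what is desired explicit.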
Verification of UML Dynamic Specifications using Simulation-based Timing Analysis, in Proc. International Conference on Reliability and Quality in Design, ISSAT, Orlando, FL, August 2000, pp. 65-69
The Unified Modeling Language (UML) is the result of the unification process of earlier object oriented models and notations. Independent verification and validation (IV&V) tasks, as applied to UML specifications, enable early detection of analysis and design flaws prior to implementation.
In this paper, we address an important IV&V task that we perform on UML models, which is timing analysis of UML dynamic specifications. We discuss an approach for automatic generation of timing diagrams from the simulation logs obtained from simulating UML specifications. We develop four timing analysis methods: concurrency-based, environmental-interactions, timeouts-based, and performance-based. We show results from applying the proposed timing analysis methods to an illustrative example, a pacemaker specification.
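As a rough illustration of one of the four methods, a timeouts-based analysis can be sketched as a scan over simulation-log records looking for request/response pairs that miss a deadline. The log format, event names, and the 0.8 s deadline are hypothetical assumptions, not taken from the pacemaker specification:

```python
# Hypothetical sketch of a timeouts-based timing analysis over a
# simulation log of (time, object, event) records. The log below and
# the deadline are invented for illustration.

log = [
    (0.00, "pacemaker", "sense_request"),
    (0.25, "heart",     "sense_response"),
    (1.00, "pacemaker", "sense_request"),
    (2.10, "heart",     "sense_response"),
]

DEADLINE = 0.8  # seconds (assumed)

def timeout_violations(records, deadline):
    """Pair each request with the next response and flag late ones."""
    violations = []
    pending = None  # time of the outstanding request, if any
    for t, _obj, event in records:
        if event == "sense_request":
            pending = t
        elif event == "sense_response" and pending is not None:
            if t - pending > deadline:
                violations.append((pending, t))
            pending = None
    return violations

print(timeout_violations(log, DEADLINE))  # → [(1.0, 2.1)]
```

The same log-scanning shape generalizes: the concurrency-based and performance-based analyses would look at overlapping activations and cumulative execution times in the same record stream.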
COTS-supported Web-based Interactive Electronic Technical Manual Architecture Using UML, In Proceedings of the 2nd Symposium on Reusable Architectures and Components for Developing Distributed Information Systems (RACDIS'99), Orlando, Florida, July 2000.
Fingerprint Registration Using Genetic Algorithms, in Proc. of Application Specific Software Engineering Technology ASSET'2000, IEEE Computer Society, Dallas, Texas, March 2000
In automated fingerprint identification systems, an efficient and accurate alignment algorithm in the preprocessing stage plays a crucial role in the performance of the whole system. In this paper, we explore the use of genetic algorithms for optimizing the alignment of a pair of fingerprint images. To test its performance, we compare the implemented genetic algorithm with two other algorithms, namely, a 2D and a 3D algorithm. Based upon our experiment on 250 pairs of fingerprint images, we find that: 1) the genetic algorithm runs ten times faster than the 3D algorithm with similar alignment accuracy, and 2) the genetic algorithm is 13% more accurate than the 2D algorithm, with the same running time. The conclusion drawn from this study is that a genetic algorithm approach is an efficient and effective approach for fingerprint image registration.
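A minimal sketch of this kind of approach (not the paper's implementation) might encode a candidate rigid transform (dx, dy, theta) as a chromosome and evolve it against a nearest-neighbour alignment fitness; the point sets, population size, and mutation scale below are all illustrative:

```python
import math
import random

# Hypothetical GA for aligning two minutiae point sets with a rigid
# transform (dx, dy, theta). Fitness is the negated sum of distances
# from each transformed query point to its nearest template point.

def transform(points, dx, dy, theta):
    """Rotate each point by theta, then translate by (dx, dy)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + dx, s * x + c * y + dy) for x, y in points]

def fitness(candidate, template, query):
    """Higher is better: negated total nearest-template distance."""
    dx, dy, theta = candidate
    moved = transform(query, dx, dy, theta)
    return -sum(min(math.hypot(qx - tx, qy - ty) for tx, ty in template)
                for qx, qy in moved)

def ga_align(template, query, pop_size=40, generations=80, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(-10, 10), rng.uniform(-10, 10),
            rng.uniform(-math.pi, math.pi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, template, query), reverse=True)
        survivors = pop[: pop_size // 2]      # elitist truncation selection
        children = [tuple(g + rng.gauss(0, 0.3) for g in parent)
                    for parent in survivors]  # Gaussian mutation
        pop = survivors + children
    return max(pop, key=lambda c: fitness(c, template, query))

template = [(0.0, 0.0), (4.0, 1.0), (2.0, 5.0), (7.0, 3.0)]
query = [(x - 3.0, y + 2.0) for x, y in template]  # template shifted by (3, -2)
best = ga_align(template, query)
print(best)
```

A 2D variant of the comparison in the abstract would simply fix theta to zero and search translation only, which shrinks the search space at the cost of missing rotated impressions.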
Automating the Development of Pattern-Oriented Designs, in Proc. of Application Specific Software Engineering Technology ASSET'2000, IEEE Computer Society, Dallas, Texas, March 2000
"Scenario-based Reliability Analysis of Component-Based Software", in Proceedings of the Tenth International Symposium on Software Reliability Engineering, ISSRE'99, Boca Raton, Florida USA, November 1-4, 1999, pp. 22-31
Software designers are motivated to utilize off-the-shelf software components for rapid application development. Such applications are expected to have high reliability as a result of deploying trusted components. The claims of high reliability need further investigation based on reliability estimation models and techniques that are applicable to component-based applications.
This paper introduces a probabilistic model and a reliability estimation and analysis technique applicable to high-level designs. The technique is named "Scenario-Based Reliability Estimation" (SBRE). SBRE is specific to component-based software whose analysis is strictly based on execution scenarios. Using scenarios, we construct a probabilistic model named "Component-Dependency Graph" (CDG). CDGs are directed graphs that represent components, component reliabilities, link and interface reliabilities, transitions, transition probabilities, and average execution times of components. In CDGs, component interfaces and link reliabilities are treated as first-class elements of the model. Based on CDGs, an algorithm is presented to analyze the reliability of the application as a function of the reliability of its components and interfaces. A case study illustrates the applicability of the algorithm. The SBRE algorithm is used to identify critical components and critical component interfaces by investigating the sensitivity of the application reliability to changes in the reliabilities of components and their interfaces.
Keywords: Component-Based Software, Reliability Analysis, Reliability Modeling, Component-Dependency Graphs
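A much-simplified sketch of the CDG idea follows. The component reliabilities, transition probabilities, and link reliabilities are invented, and a plain depth bound stands in for the paper's average-execution-time bound:

```python
# Hypothetical Component-Dependency Graph: nodes are components with
# reliabilities; edges carry a transition probability and a link
# reliability. All numbers below are illustrative.

comp_rel = {"A": 0.99, "B": 0.95, "C": 0.98}  # component reliabilities

# source -> list of (target, transition probability, link reliability)
cdg = {
    "A": [("B", 0.6, 0.999), ("C", 0.4, 0.995)],
    "B": [("C", 1.0, 0.99)],
    "C": [],  # terminal component
}

def app_reliability(node, depth=10):
    """Expected reliability of executions starting at `node`, expanding
    paths recursively and weighting by transition probabilities."""
    r = comp_rel[node]
    if not cdg[node] or depth == 0:
        return r
    return r * sum(p * link * app_reliability(nxt, depth - 1)
                   for nxt, p, link in cdg[node])

R = app_reliability("A")
print(round(R, 4))  # → 0.9331
```

The sensitivity analysis described in the abstract corresponds to perturbing one entry of `comp_rel` (or one link reliability) at a time and observing how much `R` moves; the component with the largest effect is the critical one.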
"Dynamic Metrics for Object Oriented Designs", In Proceedings of the Sixth International Symposium on Software Metrics, Metrics'99, Boca Raton, Florida USA, November 4-6, 1999, pp. 50-61.
As object oriented analysis and design techniques become widely used, the demand for assessing the quality of object-oriented designs substantially increases. Recently, there has been much research effort to develop and empirically validate metrics for OO design quality. Complexity, coupling, and cohesion have received considerable interest in the field. Despite the rich body of research and practice in developing design quality metrics, there has been less emphasis on dynamic metrics for object-oriented designs. The complex dynamic behavior of many real-time applications motivates a shift in interest from traditional static metrics to dynamic metrics.
This paper addresses the problem of measuring the quality of object-oriented designs using dynamic metrics. We present a metrics suite to measure the quality of designs at an early development phase. The suite consists of metrics for dynamic complexity and object coupling based on execution scenarios. The proposed measures are obtained from executable design models. We apply the dynamic metrics to assess the quality of a "Pacemaker" application. Results from the case study are used to compare static metrics to the proposed dynamic metrics and hence identify the need for empirical studies to explore the dependency of design quality on each.
Keywords: Dynamic Metrics, Design Quality, Object-Oriented Designs, and Real-Time OO Modeling
"An Integrated Tool Environment for DoD Product Line Engineering", In Proceedings of the 1st Symposium on Reusable Architectures and Components for Developing Distributed Information Systems (RACDIS'99), Orlando, Florida, August 2-3, 1999, pp. 618-620
"Tool Support for Developing Pattern-Oriented Architectures", In Proceedings of the 1st Symposium on Reusable Architectures and Components for Developing Distributed Information Systems (RACDIS'99), Orlando, Florida, August 2-3, 1999, Vol I, pp. 665-670.
Design patterns have recently attracted the interest of researchers and practitioners as reusable, proven solutions to frequent design problems. Deploying these solutions to develop complex information systems is a tedious task that involves integration issues and iterative development. Tool support for development with patterns will eventually facilitate the analysis and design phases. Current modeling tools do not explicitly support patterns as architecture constructs with interfaces.
This paper discusses the requirements specification of a visual composition tool that supports the development of object oriented architectures from constructive design patterns. The notion of pattern interfaces is made explicit to integrate patterns at the architecture level. The tool facilitates the development of architectures for information systems, such as the Client/Server architecture for distributed medical informatics systems. Current visual modeling languages and their tool support do not explicitly incorporate the new concepts of pattern diagrams and pattern interfaces. The proposed tool supports high-level designs using patterns as design components with interfaces, and integrates with existing tools for lower-level designs in terms of class and collaboration diagrams.
"Toward an Integrated Approach to Systems Design", in Proceedings of Photonics East, PE'99, Intelligent Systems in Design and Manufacturing II, SPIE The International Society for Optical Engineering, Hynes Convention Center, Boston, USA, September 19-22, 1999, Vol. 3833-12, pp. 69-76.
Sherif M. Yacoub, Hany H. Ammar “The Development of a Client/Server Architecture for Standardized Medical Application Network Services” IEEE Transactions on Software Engineering, March 1999
"Risk Assessment of Functional Specification of Software Systems Using Colored Petri Nets," Proceedings of the International Symposium on Software Reliability Engineering (ISSRE'97), IEEE Comp. Soc., November 1997.
This paper presents an example of risk assessment in complex real-time software systems at the early stages of development. A heuristic risk assessment technique based on Colored Petri Net (CPN) models is used to classify software components according to their relative importance in terms of such factors as severity and complexity. The methodology of this technique is presented in a companion paper. The technique is applied to the Earth Operation Commanding Center (EOC_COMMANDING), a large component of NASA's Earth Observing System (EOS) project. Two specifications of the system are considered: a sequential model and a pipeline model. Applying the technique to both CPN-based models yields different complexity measures: the pipeline model clearly shows a higher risk factor than the sequential model, whereas under traditional complexity measures the risk factors were similar in both models. This result helps identify components with a high risk factor, which would require the development of effective fault-tolerance mechanisms.
A Methodology for Risk Assessment and Performability Analysis of Large Scale Software Systems
International Conference on Engineering Mathematics and Physics, Cairo, Egypt, December 1997
This paper describes a methodology for modeling and analysis of large-scale software specifications of concurrent real-time systems. Two types of analysis, namely risk assessment and performability analysis, are presented. Both types of analysis are based on simulations of Colored Petri Net (CPN) software specification models. These CPN models are mapped from the software specifications originally developed using Computer-Aided Software Engineering (CASE) tools. Thus the methodology lends itself to a three-step process: in the first step, CASE-based models are mapped to the CPN notation; in the second step, the CPN models are completed for scenario-based simulations; finally, in the third step, the models are simulated for risk assessment and performability analysis. A model of large industrial-scale software specifications is presented to illustrate the usefulness of this approach. The model is based on a component of NASA's Earth Observing System (EOS).
A Methodology For Risk Assessment of Functional Specification of Software Systems Using Colored Petri Nets
International Symp. on Software Metrics, IEEE Computer Soc., Nov. 1997
This paper presents a methodology for risk assessment in complex real-time software systems at the early stages of development, namely the analysis/design phase. A heuristic risk assessment technique is described based on Colored Petri Net (CPN) models. The technique uses complexity metrics and severity measures in developing a heuristic risk factor from software functional specifications. The objective of risk assessment is to classify the software components according to their relative importance in terms of such factors as severity and complexity. Both traditional static and dynamic complexity measures are supported. Concurrency complexity is presented as a new dynamic complexity metric; it measures the added dynamic complexity due to concurrency in the system. Severity analysis is conducted using failure mode and effects analysis (FMEA).
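The heuristic risk factor described above can be sketched as the product of a normalized complexity and an FMEA-derived severity. The component names, complexity values, and severities below are invented for illustration, not data from the EOS study:

```python
# Hypothetical heuristic risk factor: hrf_i = normalized complexity of
# component i times its FMEA severity in [0, 1]. All values invented.

components = {
    # name: (dynamic complexity measure, FMEA severity in [0, 1])
    "commanding": (120, 0.95),
    "telemetry":  (80,  0.50),
    "logging":    (30,  0.10),
}

# Normalize complexity by the maximum across components.
max_cpx = max(c for c, _ in components.values())
risk = {name: (cpx / max_cpx) * sev
        for name, (cpx, sev) in components.items()}

# Rank components by heuristic risk factor, highest first.
ranking = sorted(risk, key=risk.get, reverse=True)
print(ranking, {k: round(v, 3) for k, v in risk.items()})
```

Components at the top of the ranking are the candidates for fault-tolerance mechanisms, which is the classification use the abstract describes.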
Performability Analysis of the Commanding Component of NASA’s Earth Observing System
The 10th International Conf. on Parallel and Distributed Computing, New Orleans, Oct. 1997
The objective of this work is to develop methods and techniques for generating verification and analysis models from notations used for parallel and distributed systems specifications. The resulting verification models can be subjected to extensive and exhaustive verification of the requirement specifications.
This paper presents an application of the methodology we developed to integrate a CASE environment based on the SART (Structured Analysis with Real Time) notation with a CPN (Coloured Petri Nets) based verification environment.
Parallel Algorithms for an Automated Fingerprint Image Comparison System
International Symp. on Parallel and Distributed Processing (SPDP'96), IEEE Computer Soc., Oct. 1996
This paper addresses the problem of developing efficient parallel algorithms for the training procedure of a neural network based Fingerprint Image Comparison (FIC) system. The target architecture is assumed to be a coarse-grain distributed memory parallel architecture. Two types of parallelism, node parallelism and training set parallelism (TSP), are investigated. Theoretical analysis and experimental results show that node parallelism has low speedup and poor scalability, while TSP proves to have the best speedup performance. TSP, however, suffers from a slow convergence rate. In order to reduce this effect, a modified training set parallel algorithm using weighted contributions of synaptic connections is proposed. Experimental results show that this algorithm provides a fast convergence rate, while keeping the best speedup performance obtained.
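A toy sketch of training set parallelism, under strong simplifying assumptions (a 1-D linear model stands in for the neural network, and a plain sum of per-worker gradients stands in for the paper's weighted-contribution scheme):

```python
# Hypothetical TSP sketch: the training set is split across P workers,
# each computes the gradient over its partition, and the contributions
# are summed into one synchronous weight update. Model and data are toy.

def gradient(w, batch):
    """Gradient of squared error for a 1-D linear model y = w * x."""
    g = 0.0
    for x, y in batch:
        g += 2.0 * (w * x - y) * x
    return g

def tsp_step(w, data, workers, lr=0.01):
    """One synchronous update: round-robin partition, sum per-worker
    gradients (which equals the serial gradient, since it is additive)."""
    parts = [data[i::workers] for i in range(workers)]
    total = sum(gradient(w, part) for part in parts)
    return w - lr * total

data = [(x, 3.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]  # targets from w* = 3
w = 0.0
for _ in range(200):
    w = tsp_step(w, data, workers=4)
print(round(w, 3))  # → 3.0
```

Because the summed per-partition gradients reproduce the serial batch gradient exactly, this toy shows why TSP speeds up each epoch without changing what a single synchronous step computes; the convergence-rate issue the abstract mentions arises in the asynchronous or per-partition-update variants.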
Implementation of a Training Set Parallel Algorithm for an Automated Fingerprint Image Comparison System
International Conf. on Parallel Processing (ICPP'96), IEEE Computer Soc., Aug. 1996
This paper addresses the problem of implementing a training set parallel algorithm (TSPA) for the training procedure of a neural network based Fingerprint Image Comparison (FIC) system. Experimental results on a 32-node CM-5 system show that TSPA achieves almost linear speedup performance. This parallel algorithm is applicable to ANN training in general and is not dependent on the ANN architecture.