Year 2006 Volume 3(30) no. 2
ON NOISE MODELLING IN PHYSICAL MODELLING APPROACH
Physical modelling refers to the first representation of a process model, expressed as a set of differential and algebraic equations. Noise added to the model can improve the estimated behaviour of the process. Adding white noise to all variables is not recommended, mainly because some variables could be based on the derivative of the white noise, which is computationally infeasible and physically impossible. The studied problem is to decide where noise may be added, from a physical perspective, early at the modelling stage, and thus to avoid further numerical problems at the simulation stage.
Process Modelling, Object-Oriented Modelling Languages, Neutral Modelling, Noise Modelling, DAE models.
AGENT BASED ROUTING URBAN DRIVING ADVISORY SYSTEM
Camelia Avram, Adina Astilean, Mihai Hulea, Tiberiu Letia, Honoriu Valean
The paper presents a hierarchical routing strategy for an Urban Driving Advisory System (UDAS) based on agent technology. UDAS assists drivers in reaching their desired destination, taking into account the current traffic conditions. It gives the estimated arrival time and the corresponding distance between a start and an arrival point. The necessary information is obtained from a real-time traffic control system (RTCS). Drivers can consult the advisory system using a variety of devices, such as mobile phones. The information given by the advisory system takes the form of predefined short messages.
multiagent system, GSM communication, routing algorithms
INVESTIGATION ABOUT E-LEARNING SYSTEMS FOR RISK MANAGEMENT AND CONTROL
Ali Azarian, Vincent Brindejonc and Jean Marc Bruere
A helpful survey on e-learning in the risk management domain has been performed in the context of the Vicaria project (Leonardo da Vinci - No PP 118018), which was aimed at providing a "blended learning" system based on a combination of different media in order to optimize open distance learning in the risk assessment domain. The results of the above-mentioned investigation are partly summarized in this paper.
software engineering, models, methods and tools, databases and multimedia, eLearning
A DISTRIBUTED SIMULATOR FOR NEURAL NETWORKS TRAINING
Sorin Babii and Vladimir Cretu
This paper presents a distributed simulator for neural networks – NetPar, and the results of several experiments in distributing the training phase of an artificial neural network. We developed and analyzed a distribution strategy for the back-propagation algorithm.
We describe a distribution procedure for the well-known back-propagation algorithm and implemented it on several networks of computers, which allowed us to evaluate and analyze its performance using the results of actual experiments. We were interested in the qualitative aspects, trying to understand the factors that determine the behavior of this distributed algorithm. We tried to emphasize some specific aspects to be considered when implementing such a parallel algorithm on a set of workstations interconnected in a local area network. We also investigated the possibilities of exploiting the computational resources of such a set of workstations.
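The data-parallel pattern described above - each workstation computing gradients on its own share of the training set, with the results averaged before every weight update - can be sketched as follows. This is an illustrative toy with a one-weight linear model, not the NetPar code; the learning rate, data, and shard layout are our assumptions.

```python
# Sketch of training-set parallelism for gradient-based learning (hypothetical,
# not the NetPar implementation): each worker computes the loss gradient on its
# own shard; a coordinator averages the per-worker gradients and applies one
# update, exactly as if the whole batch had been processed on one machine.

def local_gradient(w, shard):
    """Gradient of mean squared error for the 1-D linear model y = w*x."""
    g = 0.0
    for x, y in shard:
        g += 2.0 * (w * x - y) * x
    return g / len(shard)

def parallel_step(w, shards, lr=0.01):
    """One synchronous update: average the shard gradients, then update w."""
    grads = [local_gradient(w, s) for s in shards]   # done in parallel in practice
    avg = sum(grads) / len(grads)
    return w - lr * avg

# Toy data for the target y = 3*x, split across two "workers" of equal size,
# so the averaged gradient equals the full-batch gradient.
data = [(x, 3.0 * x) for x in range(1, 9)]
shards = [data[:4], data[4:]]

w = 0.0
for _ in range(50):
    w = parallel_step(w, shards)
print(round(w, 3))  # → 3.0
```

With equally sized shards the averaged gradient is identical to the sequential full-batch gradient, which is why this kind of distribution preserves the numerical behaviour of the algorithm while spreading the work.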
neural networks, back-propagation algorithm, parallel algorithms, distributed algorithms, computer networks.
LRU AS DICTIONARY REPLACEMENT POLICY
IN NEAR-LOSSLESS LZW IMAGE COMPRESSION
Macarie Breazu, Antoniu Pitic
Classical image compression methods are based on measuring the error at the whole-image level only. In some areas there is an obvious need for an upper bound on the error at pixel level. In this paper we analyze such a near-lossless method based on the LZW dictionary algorithm and evaluate the LRU solution for dealing with the moment when the dictionary becomes full. The changes needed to adapt LZW into a near-lossless method are also presented. As far as we know, our approach is the first attempt to use LZW as a near-lossless method. Experimental results obtained and presented in the paper show that the LRU solution gives better results than the solutions based on dictionary freezing or clearing, or on quadtree partitioning.
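The dictionary-management idea can be sketched with a small LZW compressor whose bounded dictionary evicts its least-recently-used multi-character entry when full, instead of freezing or clearing. This is our simplified illustration, not the authors' implementation: it omits the near-lossless error bound, and a real codec would also reuse the evicted code number (and the decoder would mirror the eviction policy).

```python
# Simplified LZW with LRU dictionary replacement: single-byte entries are
# permanent; longer entries compete for the bounded dictionary space and the
# least-recently-used one is evicted once the dictionary fills up.
from collections import OrderedDict

def lzw_lru_compress(data, max_size=300):
    base = {chr(i): i for i in range(256)}   # permanent single-char entries
    extra = OrderedDict()                    # insertion/access order = LRU order
    next_code = 256
    out, cur = [], ""
    for ch in data:
        cand = cur + ch
        if cand in base or cand in extra:
            if cand in extra:
                extra.move_to_end(cand)      # mark as recently used
            cur = cand
        else:
            out.append(base[cur] if cur in base else extra[cur])
            if len(extra) >= max_size - 256:
                extra.popitem(last=False)    # evict the LRU entry
            extra[cand] = next_code
            next_code += 1
            cur = ch
    out.append(base[cur] if cur in base else extra[cur])
    return out

codes = lzw_lru_compress("abababababab")
print(len(codes))  # 6 codes for 12 characters: repeated patterns compress
```

The point of LRU over freezing is that recently useful phrases stay available, so compression keeps adapting as the image statistics change.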
image, compression, near-lossless, quadtree, LZW, LRU.
ENCAPSULATED MULTI-METHODS AND
MULTIPLE DISPATCHING IN OBJECT
ORIENTED HIGH LEVEL PETRI NETS
In recent years, several proposals have tried to associate object-oriented formalisms and Petri nets into a single framework that combines the expressive power of both approaches. This paper presents a Petri net formalism called Object Oriented High Level Petri Nets (OOHLPN), and the implementation of encapsulated multi-methods and multiple dispatching in this formalism. Encapsulated multi-methods and multiple dispatching are important features of OOHLPNs, in addition to the separation of the inheritance and subtyping hierarchies.
Petri nets, object-oriented Petri nets, object-oriented languages,
SUBTYPE AND INHERITANCE IN OBJECT
ORIENTED HIGH LEVEL PETRI NETS
Marius Brezovan and Eugen Ganea
It is well known that the object-oriented paradigm and Petri net theory are two powerful frameworks used to specify complex systems, each of them having specific advantages. In recent years, several proposals have tried to associate them into a single framework that combines the expressive power of both approaches. This paper presents a Petri net formalism called Object Oriented High Level Petri Nets (OOHLPN), and its connection with object-oriented methodologies. One important feature of the OOHLPN formalism is its treatment of the inheritance and subtyping notions. OOHLPNs use distinct hierarchies for subtyping and subclassing, the connection between a type hierarchy and an associated inheritance hierarchy being a conformance relation.
Petri nets, object-oriented Petri nets, system modelling, formal methods, algebraic specifications
PARSING AND AMBIGUITY TESTING OF XML DOCUMENTS
The paper analyses the grammar of XML elements and presents algorithms for testing the ambiguity of elements and for checking the structure of XML documents against the definition of the elements.
PEAQ – AN OBJECTIVE METHOD TO ASSESS THE PERCEPTUAL QUALITY OF
AUDIO COMPRESSED FILES
Dinu Câmpeanu and Andrei Câmpeanu
Until now, the only way to measure the sound quality of modern audio coding systems at low bit rates has been to implement elaborate listening tests with experienced human test subjects. Consequently, the idea of substituting the subjective tests with objective, computer-based methods has been an ongoing focus of research and development. The result was PEAQ, an algorithm (ITU-R Recommendation BS.1387-1) that combines a number of psycho-acoustical tests to give a measure of the quality difference between two instances of a signal (a reference and a test signal). The paper describes a MATLAB implementation of the PEAQ method and the procedure used to assess it. Finally, PEAQ is used to rate the quality of operation of some well-known digital audio editors and audio codecs.
audio, quality, measurement, evaluation, perceptual, perceived, algorithm,
EXPERIMENTAL RESULTS ON CONTENT ANALYSIS
USING GOOGLE SET
Catalin Constantin Cerbulescu, Stefan Udristoiu,
Claudia Monica Cerbulescu
An important problem in both current research and software development is content analysis. The results of this research are reflected in eliminating unimportant messages (spam filtering) and selecting the most important messages from a set. The present paper presents a content analysis algorithm, implemented in a web-based application, along with its experimental results. According to an interest domain (defined by a set of words), the target content is analyzed. The result, a vector of floating-point numbers, is processed and analyzed using statistical methods. This approach is based on http://labs.google.com/sets to group words by importance and relevance.
pattern recognition, algorithms, data processing, machine learning, statistical analysis.
EXPERIMENTS IN FUZZY IMAGE SEGMENTATION
Dorian Cojocaru, Razvan Tanasie, Elena Barbulescu
As a general concept, the task of cluster analysis is to partition data into a number of groups, or clusters. By applying this partitioning operation to images, image segmentation - a very important task in image processing - is obtained. This paper presents a fuzzy segmentation algorithm, fuzzy c-means, applied to grayscale images.
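The fuzzy c-means idea can be sketched on 1-D grayscale intensities: every pixel receives a membership degree in each cluster, and cluster centres are recomputed as membership-weighted means. This is an illustrative minimal version, not the paper's code; the two-cluster initialisation and the data are our assumptions.

```python
# Minimal fuzzy c-means on 1-D grayscale values: alternate between updating
# memberships (closer centre -> membership nearer 1) and updating centres
# (weighted means with weights u^m).

def fuzzy_c_means(values, m=2.0, iters=50):
    centers = [min(values), max(values)]     # crude init, fixed at c = 2 clusters
    c = len(centers)
    u = [[0.0] * len(values) for _ in range(c)]
    for _ in range(iters):
        for k, x in enumerate(values):
            d = [abs(x - cj) or 1e-9 for cj in centers]   # avoid divide-by-zero
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        for i in range(c):
            w = [uik ** m for uik in u[i]]
            centers[i] = sum(wk * x for wk, x in zip(w, values)) / sum(w)
    return centers, u

# Two intensity populations: dark (~20) and bright (~200).
pixels = [18, 20, 22, 25, 198, 200, 202, 205]
centers, u = fuzzy_c_means(pixels)
print([round(cc) for cc in sorted(centers)])  # roughly [21, 201]
```

In image segmentation the same update runs over all pixel intensities, and each pixel is finally assigned to (or blended between) the clusters according to its membership degrees.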
computer vision, image processing, image segmentation, fuzzy logic.
HTML-AWARE TECHNIQUE FOR DATA CLEANING
Web pages created by human authors or dynamically generated using different scripts and data stored in a back-end database frequently contain many common mistakes. In order to obtain an acceptable result, the web rendering engine of a browser and automatic tools for extracting information from HTML pages need a preprocessing step to clean those web pages. We devised a simple algorithm whose main task is to properly close the tags and transform a page from HTML into a well-formed format.
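The tag-closing step of such a cleaner can be sketched with a small stack-based pass. This is an illustrative toy, not the paper's algorithm; the VOID tag list and the regular-expression tokenizer are our simplifying assumptions.

```python
# Toy HTML cleaner: scan the tags with a stack and append closing tags for
# any elements left open, yielding a well-formed nesting at the end.
import re

VOID = {"br", "img", "hr", "meta", "link", "input"}  # no closing tag needed

def close_tags(html):
    stack = []
    for m in re.finditer(r"<(/?)([a-zA-Z][a-zA-Z0-9]*)[^>]*>", html):
        closing, name = m.group(1), m.group(2).lower()
        if closing:
            if name in stack:
                # Discard anything opened after this element, then close it.
                while stack and stack[-1] != name:
                    stack.pop()
                stack.pop()
        elif name not in VOID:
            stack.append(name)
    # Whatever is still open gets closed in reverse order.
    return html + "".join(f"</{t}>" for t in reversed(stack))

print(close_tags("<html><body><p>hello"))
# → <html><body><p>hello</p></body></html>
```

A production cleaner would also repair mis-nested closers in place rather than only appending at the end, but the stack discipline shown here is the core of the idea.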
bad data identification, data processing, transformations, html data, cleaning
QUERY LANGUAGES FOR ASSOCIATION RULES
Knowledge discovery from huge databases is one of the most pressing current problems. To solve it we can use an inductive database. With an inductive database the analyst performs a set of very different operations on data using a special-purpose language, powerful enough to perform all the required manipulations, such as data preprocessing, pattern discovery and post-processing. In this paper we present two query languages (MSQL and MINE RULE) that have been proposed for mining association rules, and discuss their common features and differences.
association rules, query languages, data mining
WEB PRESENCE OF TRAVEL AGENCIES
FROM TRANSYLVANIA-ROMANIA AND HUNGARY
Vera-Melinda GÁLFI, Liciniu-Alexandru KOVÁCS,
Ioan-Cristian CHIFU-OROS and Stefan MOLDOVAN
Tourism develops in all countries because it is a good source of income and, sometimes, a better solution than working elsewhere. But in a global economy, a travel agent cannot promote tourism products and services in a limited area only. Wherever they go to or come from, tourists want to be well informed. As a consequence, it is a good starting point to visit and observe websites belonging to travel agencies from different regions/countries and find out how the Web service of the Internet helps these processes. Surprises may appear.
comparative study, travel agency, website, webpage, spreadsheet, chart,
CHARACTER RECOGNITION USING NEURAL NETWORKS
Eugen Ganea, Marius Brezovan, Simona Ganea
Numerous advances have been made in developing intelligent systems, some inspired by biological networks. The paper discusses the usefulness of neural networks, more specifically the motivations behind the development of neural networks, and outlines network architectures and learning processes. We conclude with character recognition, a successful layered neural network application.
neural network, training procedure, layered network, learning rule,
perceptron, back propagation, character recognition.
DESCRIPTIVE COMPOSITIONAL HSPN MODELING OF COMPUTER SYSTEMS
In this paper we define a set of composition operations and descriptive expressions for the construction of composite Hybrid Stochastic Petri Nets (HSPN) for discrete-continuous performance modelling of computer systems. We consider enhancements of our approach for the performance modelling of a multiprocessor system.
Descriptive expressions, discrete-continuous modeling, hybrid stochastic Petri
nets, performance evaluation.
THE DESCRIPTION OF A NEW MEDICAL SOFTWARE TOOL FOR HOSPITALS
MANAGEMENT AND FINANCE
Liana Stanescu and
Dumitru Dan Burdescu
In this paper we present an efficient and flexible software system as an alternative to the DRG-National application used by the Romanian Government to finance hospitals. This alternative software system is in fact an on-line application based on JSP technology and a MySQL database, intended to replace the old application realized in MS Access 2000. This application will help the DRG National Bureau to analyze real-time data at any time, and even send it to international organizations. Users can easily get accustomed to this new application, because it keeps the structure and many menus of the old application.
medical applications, management systems
MULTIMEDIA AND CRYPTOGRAPHY
In the computing world there is a permanent concern with keeping information secret, hiding individual information, or preserving the author's rights over the respective data. To fulfil this desideratum, which over time has acquired new dimensions and degrees of difficulty, new methods have appeared, based principally on the steganography principle, which has been continuously developed and improved.
information technologies, cryptography, steganalysis, multimedia.
FUNCTIONAL DIAGNOSIS FOR MEDICAL PURPOSES BY USING AN INTEGRATED INTELLIGENT SYSTEM
Isoc D, Pop M, Ignat A, Ionescu CM, De Keyser RMC
Medical diagnosis using technical means has many applications, especially when the information to be processed is non-homogeneous. Technical means are both instruments or apparatus and software tools. The main outcomes are efficiency and accurate information processing. This work is dedicated to extending the diagnosis means by integrating, together with instruments, apparatus, and software tools, the highly qualified knowledge bases usually existing in large university clinics. This integration is achieved so that any member of the qualified medical staff working directly with the patient can access a specialized knowledge base using an expert system. This expert system correlates the problem identified by the clinician during the primary investigation with the similar existing cases over a large time horizon inside the clinic.
The identified problem and the existing knowledge are associated inside of an intelligent predictive algorithm.
The suggested approach is applied in the field of lung diseases where all the necessary features and problems are available.
The outcome of the intelligent integrating approach in the chosen application field emphasizes that the results are interesting and economically efficient.
medical diagnosis, intelligent system, case-based inference, integrated intelligent system, intelligent prediction.
AUTOMATING THE DISPUTE RESOLUTION FOR B2B
Ioan Alfred Letia and Adrian Groza
The speed of supply chain formation requires new modes of resolving disputes under hard time constraints. Also, the design of punishment policies applied to specific domains, linking agents' actions to material penalties, is an open research issue. In our framework the principles of contract law are applied to set penalties: expectation damages, opportunity cost, reliance damages, and party-designed remedies, and they are introduced in the task dependency model (Walsh and Wellman, 2003). Trust is supported by providing arguments for each imposed penalty.
artificial intelligence, agents, electronic contracts.
USING TRUST FOR DELEGATION IN MULTI-AGENT SYSTEMS
Ioan Alfred Letia and
Radu Razvan Slavescu
In this paper, we present a new approach to numerical trust evaluation, based on its cognitive components of competence and willingness, and on how willingness can be seen as a measure of the price of cooperation. We also show how this approach can be extended to cover the case of re-delegation, i.e. delegating a task from one agent to an intermediate one and then to the agent who actually does the job. We study how trust is distributed among the nodes of a social network, especially in the situation when an agent possesses only partial information about the delegation chain topology and about the competence and willingness of the intermediate agents.
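A small numeric sketch of the chained-delegation idea follows. It is our illustration of the general pattern, not the authors' exact formulas: here an agent's trustworthiness is simply the product of its competence and willingness, and trust in a delegation chain decays multiplicatively through every intermediary.

```python
# Trust along a delegation chain A -> B -> ... -> worker, with each hop
# contributing competence * willingness (both in [0, 1]).

def agent_trust(competence, willingness):
    return competence * willingness

def chain_trust(agents):
    """Trust in a whole delegation chain, one (competence, willingness) per hop."""
    t = 1.0
    for competence, willingness in agents:
        t *= agent_trust(competence, willingness)
    return t

direct = chain_trust([(0.9, 0.8)])                      # delegate directly
via_middleman = chain_trust([(0.95, 0.9), (0.9, 0.8)])  # re-delegation
print(direct > via_middleman)  # True: every extra hop can only lower trust
```

Under this multiplicative model an agent with only partial information about the chain can still bound the overall trust from below using worst-case estimates for the unknown intermediaries.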
artificial intelligence, agents, cooperation
MULTIAGENT SYSTEM FOR URBAN VEHICLE TRAFFIC CONTROL
Tiberiu Letia, Adina Astilean, Camelia Avram,
Mihai Hulea, Honoriu Valean
The urban vehicle traffic problems are presented together with the most recent and relevant methods proposed to solve them. A solution based on a multiagent system that reacts continuously to changes in the environment is developed. This adaptive distributed control system takes information from sensors and sends control signals to microcontrollers to implement the phase durations. For the verification of the solution, a real-time simulator that implements the relevant characteristics of the traffic was used.
traffic control, distributed control, adaptive control, real-time systems, intelligent control.
SOCIAL ANIMATED AGENTS: FUTURE OF INTELLIGENT TUTORING SYSTEMS?
Bogdan-Florin Marin, Axel Hunger, Stefan Werner
We present an approach based on the concepts of emotions, tutoring agents and social positions to enhance and support the interaction between users within social learning environments. It advocates the use of artificial agent societies as a complement to human societies and assumes that agents will need to join such a society in order to realise users' learning goals. This work is a more detailed description of the framework presented by Marin et al. (2004a).
Artificial Intelligence, Software Engineering, Databases.
DESIGN AND EVALUATION OF TWO PARALLEL SORTING ALGORITHMS BASED ON MPI TECHNOLOGY
Ioan Z. MIHU, Horia V. CAPRITA
Message-passing architectures consist of multiple computers interconnected through an interconnection network, communicating with one another through send/receive message functions and synchronized by barrier functions. At the moment there exist many libraries that provide a set of standardized functions for parallel programming, such as the Message Passing Interface (MPI). The MPI library allows the implementation of parallel algorithms on message-passing architectures. In this paper we propose two parallel sorting algorithms designed for message-passing architectures and implemented using the MPI library: a parallel Insertionsort and a parallel Quicksort algorithm. We evaluate the performance of the proposed algorithms on three types of message-passing architectures: linear array, two-dimensional mesh and hypercube. We evaluate the sorting time, the inter-process communication time and the total processing time for each topology, and we analyze the efficiency of the two parallel sorting algorithms in relation to the parallel system topology.
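The overall pattern such algorithms follow - scatter the data, sort each block locally, then merge blocks pairwise as results travel up the topology - can be sketched without actual message passing. This is our process-free simulation, not the paper's MPI code: here the "workers" run sequentially, whereas with MPI each block would live on a separate node and each merge round would use send/receive calls.

```python
# Scatter / local sort / pairwise-merge sketch of a parallel sort.
from heapq import merge

def scatter(data, p):
    """Split data into p nearly equal blocks (the MPI_Scatter step)."""
    n = len(data)
    bounds = [n * i // p for i in range(p + 1)]
    return [data[bounds[i]:bounds[i + 1]] for i in range(p)]

def parallel_sort(data, p=4):
    blocks = [sorted(b) for b in scatter(data, p)]   # local sorts, in parallel
    while len(blocks) > 1:                           # merge rounds up the topology
        paired = []
        for i in range(0, len(blocks), 2):
            right = blocks[i + 1] if i + 1 < len(blocks) else []
            paired.append(list(merge(blocks[i], right)))
        blocks = paired
    return blocks[0]

print(parallel_sort([5, 3, 8, 1, 9, 2, 7, 4, 6, 0]))
# → [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The number of merge rounds is log2(p), which is why topologies with short communication paths between merge partners (such as the hypercube) tend to give the lowest communication time.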
message passing architecture, parallel programming, parallel algorithms, sorting algorithms
A NEW INDEXING TECHNIQUE FOR DATA WAREHOUSES
Data warehouses are special-purpose databases that support decision-making. In data warehouses, indexing is becoming a common feature to accelerate data mining searches that combine multiple restrictive queries. Different types of indexes have been proposed and some of them are already implemented. This paper proposes a new kind of join index structure, which can be useful for multiple join operations. The algorithm to build this type of index structure is presented. Some queries that benefit from this kind of join index are also presented.
indexing techniques, data warehouse, multiple join operations, relational databases.
MULTIMEDIA TECHNOLOGY INVOLVED IN E-LEARNING PLATFORM GENERATORS
Paulina Mitrea, Ovidiu Buza, Delia Mitrea
Our paper presents an eLearning portal generator environment with powerful multimedia facilities, consisting of: advanced multimedia object handling techniques, synchronization procedures, graphical object-oriented data structures, image processing elements (based on texture analysis), moving object extraction from video frames, etc. First we present the general design of the e-learning site generator environment and the general scheme of the major .jsp pages; we then detail some multimedia object handling techniques, as well as the mathematical background used for the synchronization between the moving images and the audio-video sequences.
eLearning portal, MultiMedia technology, broadband connection, adaptive algorithm, embedded objects, synchronization, script file, tree datastructure, control kernel, histogram.
TEXTURE-BASED METHODS IN BIOMEDICAL IMAGE RECOGNITION OF DIFFUSE LIVER DISEASES
Delia Mitrea, Sergiu Nedevschi, Bogdan Fratila, Monica Lupsor
In this paper, our purpose is to perform accurate analysis and recognition of ultrasound liver images, in order to identify diffuse liver diseases such as steatosis, cirrhosis and hepatitis. In order to perform a proper tissue analysis from ultrasound images, we compare the efficiency of texture-based methods such as Gray Level Co-occurrence Matrices (GLCM), fractals and the texton-based method.
ultrasound liver images, biomedical image recognition, decision making, accuracy, statistics, texture, GLCM, fractals, textons
CAPTURING THE TCP/IP TRAFFIC IN AN ETHERNET LAN
Mihai Mocanu, Mihai Dorobantu and Sorin Capra
Network traffic analysis is employed in order to continuously provide a communication map of all the computers in a local network, necessary to network administrators, analysts, developers and so on. In order to get detailed information, any network must be monitored. Library calls from a component added to the OS, called WinPcap, can be used to capture network packets and to generate traffic. Traffic generation can then be used to test software components (capture applications) or hardware components (network adapters, modems) in a large variety of tools (applications) specifically designed for network analysis, troubleshooting, security and/or monitoring. Design principles for an efficient, multithreaded tool for traffic analysis and sniffing in an Ethernet LAN are presented in the final part of the paper.
packet capture, network analysis, libpcap, Winpcap
LIGHTWEIGHT WEB-BASED FRAMEWORK
FOR DIAGRAMMING TOOLS
Cristina Consuela Petre
Lightweight web-based diagramming tools are designed to provide easy and rapid production of diagrams. The use of the web brings advantages such as high portability and usability and the possibility of collaborative work. Although thin-client design tools avoid problems such as installation and setup overhead or hard-to-understand user interfaces, building them is a challenging task. This paper evaluates the idea of realizing an extension to a thick-client tool which enables any specified diagram designer to be realized as a thin-client tool. The thin-client applications are accomplished by means of server-side components that interact with the thick-client designer to produce GIF diagrams for display and editing in the user's web browser. An analysis of the effectiveness of the present approach is made and potential future development is discussed.
Thin-client, thick-client, web-based diagramming tool, web-browser
THREE-DIMENSIONAL RECONSTRUCTION OF RELIEFS WITH IRREGULAR FORMS STARTING FROM SERIAL SECTIONS
Serban B. PETRESCU, Tudor C. METEA, Catalin F. TUDOSE
Three-dimensional reconstruction of the relief of geographic zones is a process with wide applications in GIS-type products, especially those that implement models for the simulation of various phenomena that may affect a specific geographic area (floods, landslides, etc.). By three-dimensional reconstruction of the relief of a geographic area starting from serial sections, we understand the regeneration of the area surface in the form of a set of connected triangles. The results of the reconstruction process may be imported into various three-dimensional visualization programs, or used as input for relief analysis programs.
Three-dimensional modelling, serial sections, level curves, polygon overlapping, tiling, triangularization, simple branching
3D VISUALIZATION REGISTRATION AND SEGMENTATION TOOL
Teo Popa and Mihai Mocanu
Medical imaging plays a crucial role in the diagnosis techniques upon which modern medical treatment depends. In concert with the increasing number of imaging techniques based on different physical principles and the availability of relatively inexpensive computational resources, more and more medical software analysis tools are being developed. We present in this article an experimental visualization system called MEVIAN (MEdical VIsualization & ANalysis) that we believe may help the radiologist in the diagnosis of liver pathology. This system is based on the VTK, ITK and FLTK open source libraries, which are freely available for download, and provides the vital components for the diagnosis of liver lesions and a reliable differential diagnosis. Using this tool the radiologist can perform accurate preoperative localization of the focal or diffuse lesion, both segmental and topographical.
medical imaging, open source, VTK, ITK, FLTK.
ACTORS AND STRUCTURE OF MODERN E-LEARNING SYSTEMS
E-learning is the continuous assimilation of knowledge and skills by adults, stimulated by synchronous and asynchronous learning events — and sometimes Knowledge Management outputs — which are authored, delivered, engaged with, supported, and administered using Internet technologies. The first e-learning systems were actually web sites which contained some electronic materials and primitive tests. Also, there were only a few categories of actors within those systems. Modern e-learning systems are based on the notion of a lesson. The people involved are grouped into several categories. The present paper tries to present those categories and also to highlight the structure of the main entity of modern e-learning systems – the lesson.
learning systems, computer applications, computer systems, computer-aided instruction
CONGAXPERT – AN ONTOLOGY
BASED ONLINE EXPERT SYSTEM
Rosu Marius, Phan Cong Chinh, Stefan Freinatis
Today's expert systems deal with domains of narrow specialization, where the set of observable facts is limited. When trying to enlarge the specialization domain, the limited number of facts becomes inaccurate. This paper presents an attempt to solve the inaccuracy problem of wide-domain expert systems by modeling the domain of knowledge and formalizing the reasoning process. The intelligence of the system is used for the selection of possible observable facts as well as for the prediction or diagnostic function of the expert system.
rule based expert system, ontology, intelligent dialogue.
E-COMMERCE SUPERMARKET SYSTEM: CONCEPTS AND IMPACT
Cosmin Stoica Spahiu, Costin Badica, Gabriel Vladut and
Michael J. Roberts
This article presents an original implementation of an e-commerce supermarket system – SUM. To be competitive in today's world, manufacturers need to enrich the range of products they can sell in a make-to-order environment, to minimize the delivery time to the client, and to maximize production in order to improve profitability. The best solution is to eliminate the retailers and sell directly to the clients, using specialized software that groups the received orders into batches, by family of products. The paper briefly summarizes the software architecture of the SUM system, and the concepts and impact of the software.
e-commerce, make-to-order, supply chain, management, software architecture
USING DYNAMIC TECHNIQUES IN COMPUTER GRAPHICS APPLICATIONS
Tanasie Razvan, Tunaru Cristina
This paper focuses on animation creation based on time-dependent transformations in a real case and presents a solution that uses computer graphics techniques in interactive applications. The application is elaborated starting from a story theme, and every character or element of the scene has an attached transformation matrix that defines the final movement of the object.
computer graphics, rendering, DirectX, mesh, vision.
CREATING AND IMPLEMENTING PROJECT-BASED E-LEARNING SCENARIOS
Philippe Trigano, Elvira Popescu, Ecaterina Giacomini
Since the project-based learning approach is advocated in the literature as one of the most effective methods of learning, our goal is to design a series of pedagogical scenarios involving project development. The IMS Learning Design specification proves very effective for this task, providing facilities for individual project-based learning as well as for collaborative work, in the case of projects done in groups. Our research is accomplished in the context of the netUniversité e-learning platform, which integrates the functionality of both a Learning Management System and a Learning Content Management System. This paper covers the process of modelling two project-based pedagogical scenarios using IMS LD and their implementation in the netUniversité platform.
project-based learning, collaborative work, instructional design, IMS Learning Design specification, educational hypermedia.
A FRAMEWORK FOR MODELING AND EVALUATING TIMING BEHAVIOUR
FOR REAL-TIME SYSTEMS
Zmaranda Doina, Rusu Claudia, Gligor Marius
In this paper we present a tool for modelling and analyzing real-time systems. The description of the system being modelled is based on the system's timing characteristics. Using the implemented tool, several quantitative timing properties of the system, such as schedulability, response time and loading factor, can be estimated. Based on the developed simulation model of a real-time system, the timing behaviour can be assessed prior to implementation.
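One concrete schedulability check of the kind such a tool can report is the classical Liu-Layland utilisation bound for rate-monotonic scheduling of periodic tasks. This is our illustrative example, not necessarily the tool's own method; the task set is hypothetical.

```python
# Sufficient rate-monotonic schedulability test for n periodic tasks given
# as (WCET C_i, period T_i) pairs: schedulable if U <= n * (2^(1/n) - 1).

def utilisation(tasks):
    return sum(c / t for c, t in tasks)

def rm_schedulable(tasks):
    n = len(tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilisation(tasks) <= bound

# Three periodic tasks, WCET and period in the same time unit.
tasks = [(1, 4), (1, 5), (2, 10)]
print(round(utilisation(tasks), 2), rm_schedulable(tasks))  # 0.65 True
```

The test is sufficient but not necessary: a task set that fails the bound may still be schedulable, which is exactly where response-time analysis or simulation, as performed by such tools, becomes useful.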
Real-time system, deadline, schedulability, Worst Case Execution Time (WCET).