ISSN 2253-0150
Editor-in-Chief:
Mohamed Ridda LAOUAR
Table of contents:
Volume 1, Issue 1
Published: 2013
Articles
Editorial
- Aris M. OUKSEL, The University of Illinois, USA
BIG DATA: SOME RESEARCH CHALLENGES IN FUTURE INFORMATION SYSTEMS (1-3)
In this inaugural issue of the journal on information systems, it is appropriate to
reflect on some of the main challenges in this field and some of the desired topics
that publishing authors will investigate. Computers, including phones, sensors and
scientific instruments, and networks have brought revolutionary information
technologies to create a digital space whose impact on humanity is being felt
everywhere. Currently, over half a billion people log into Facebook to communicate with their
contacts. They exchange more than 300 million photos and more than 3 billion votes
and comments each day. Every server, device and system generates an ever-changing
stream of information in social media, business and scientific applications.
These phenomena will accelerate over time with even richer and more
heterogeneous information. In this environment, there is a need for analytics to make
sense of the volume, diversity, complexity and uncertainty of this data, which is being
generated at a fast pace. The main challenge will be to design and implement
feature-extraction systems that can label streaming media such as images and
videos; free-form tweets; text messages; distributed Internet monitors; traffic
monitoring systems; e-mail spam generators and mutators; network firewall logs,
blogs and documents; business and scientific documents.
The goal of these systems, ultimately, will be to enable extracting insight and value
from this abundant resource (big data) and to support understanding of and
solutions to societal or business problems, from improving productivity and
efficiency, to creating new economic opportunities in a competitive environment,
and to enabling the discovery of new approaches and solutions in business,
medicine, science and the humanities. These trends highlight a symbiotic
relationship between large-scale data management and software-defined
networking. In particular, the main challenges will be the development of new
machine learning algorithms that can operate online and at large scale and can offer
flexible tradeoffs between timeliness, accuracy and cost; systems infrastructure
approaches that allow programmers to easily harness the power of scalable cloud and
cluster computing for making sense of data; and crowdsourcing human activity and
intelligence to create hybrid human/computer solutions to complex problems for
which today's automated data analysis technologies are insufficient.
In business, it is generally understood that there is a strong link between effective
data management and financial performance. Yet the extraction of economic insight
from Big Data remains elusive for most organizations. Heterogeneity, scale,
timeliness, complexity and privacy problems with Big Data impede progress at all
phases of the pipeline. Many organizations struggle with basic aspects of data
management such as cleaning, verifying and reconciling data across the
organization.
Additionally, the transformation of complex content into structured format for later
analysis remains a daunting challenge. The value of data increases substantially
when its relationships to other intra- and inter-organizational data are
exploited. Data integration/aggregation is a creator of added value, but its effective
implementation, with its attendant difficulty of semantic convergence, remains an
enormous challenge. The ubiquity of digital devices provides an opportunity
to influence and control both data format and accuracy to facilitate later automatic
linkages.
- Binghui Helen WU, FS Consulting, Wichita, Kansas, USA
DYNAMIC ANALYSIS OF SOFTWARE REQUIREMENT (4-16)
The most salient property of software, as opposed to hardware, is its endurance and adaptability to changes for the better. A software product can be delivered
to its end users with minimal marketable features initially; it matures while being used. It is therefore important to perform dynamic analysis of software requirements
to guide the product's maturation efficiently over its life span. The questions accompanying a dynamic analysis of software requirements should be, "What can be done?" and
"How should we implement it?"
This paper presents a general approach to the dynamic analysis of software requirements, using two projects that the author worked on in the past as examples. The paper
argues that the purpose of applying dynamic analysis to software requirements is twofold: to fulfill customer requests tactically and to improve the quality of products
strategically.
- Guilherme GOEHRINGER and Abraham ALCAIM, Cetuc, Puc-Rio, Brazil
FAST MOTION ADAPTIVE ESTIMATION ALGORITHM APPLIED TO
H.264/AVC (17-37)
The motion estimation techniques used by video compression standards allow an efficient use of transmission and storage resources. In this paper, we propose
a new algorithm that reduces the computational load involved, without deteriorating the quality of the reconstructed signal.
The new algorithm, called AUMHexagonS (Adaptive Unsymmetrical-cross Multi-Hexagon-grid Search), is a modification of UMHexagonS (Unsymmetrical-cross Multi-Hexagon-grid Search)
that implements a measure classifying the scenes of a video sequence according to their motion intensity. This motion intensity is used for
better operation of the motion estimation steps and for better use of some important H.264/AVC codec parameters.
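The motion-intensity idea the abstract describes can be illustrated with a minimal sketch: classify a scene by the mean magnitude of the previous frame's motion vectors, then pick a cheaper or more thorough search pattern accordingly. All function names and thresholds here are hypothetical illustrations, not the paper's actual values.

```python
def mean_motion_intensity(motion_vectors):
    """Average Euclidean length of the motion vectors of the previous frame."""
    if not motion_vectors:
        return 0.0
    total = sum((mvx ** 2 + mvy ** 2) ** 0.5 for mvx, mvy in motion_vectors)
    return total / len(motion_vectors)

def choose_search_strategy(motion_vectors, low=1.0, high=4.0):
    """Map motion intensity to a search-effort class (illustrative thresholds)."""
    intensity = mean_motion_intensity(motion_vectors)
    if intensity < low:
        return "small-pattern search"          # near-static scene: cheap search
    elif intensity < high:
        return "hexagon search"                # moderate motion
    return "full unsymmetrical-cross search"   # high motion: widest pattern
```

A near-static scene thus skips the expensive wide search, which is where the computational savings come from.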
- Wided BAKARI, Mouez ALI and Hanene BENABDALLAH, MIRACL, University of Sfax, Tunisia
AUTOMATIC APPROACH FOR GENERATING ETL OPERATORS (38-48)
This article deals with the generation of the ETL operators (Extract Transform Load) for the purpose of feeding a data warehouse using a relational data source.
This approach enables the designer to define some conditions necessary for the loading.
- Omer AbdAlkareem Jasim, Al-Ma'arif University College, Ramadi, Anbar, Iraq
THE INTEGRATION BETWEEN QUANTUM KEY DISTRIBUTION SYSTEM
AND NETWORK SECURITY CONCEPT (49-60)
Quantum theory has introduced new methods of key exchange. The security of these methods has been
investigated in the last few years, reaching several different results. In this study we present an overview of
the quantum key distribution (QKD) protocol and the widespread Internet security applications IPsec and TLS.
The research proposes how QKD could be integrated into these security applications. We also note that
existing security protocols could be used to authenticate and integrity-protect QKD protocol messages, but care
must be taken to avoid the use of quantum keys before they exist. Finally, we discuss a QKD service interface
between the QKD protocol and the security applications. This interface provides a set of common QKD services
needed by existing security protocols.
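The service-interface idea in this abstract, including the caveat about never using quantum keys before they exist, can be sketched as a small key buffer: security protocols request key material and block until the QKD layer has actually produced enough bits. This is an illustrative sketch, not a real QKD API; all class and method names are assumptions.

```python
import threading
from collections import deque

class QKDKeyService:
    """Hypothetical QKD service interface: the QKD protocol deposits verified
    key material, and consumers (e.g. an IPsec or TLS implementation) block
    until enough bits exist, so quantum keys are never consumed early."""

    def __init__(self):
        self._buffer = deque()                    # key bytes produced by QKD
        self._available = threading.Condition()   # guards the buffer

    def deposit(self, key_bytes):
        """Called by the QKD protocol when new sifted, verified key is ready."""
        with self._available:
            self._buffer.extend(key_bytes)
            self._available.notify_all()

    def get_key(self, nbytes, timeout=None):
        """Called by a security protocol; blocks until nbytes are available."""
        with self._available:
            ok = self._available.wait_for(
                lambda: len(self._buffer) >= nbytes, timeout)
            if not ok:
                raise TimeoutError("not enough quantum key material yet")
            return bytes(self._buffer.popleft() for _ in range(nbytes))
```

Blocking (rather than returning short keys) is the simple way to honor the "keys must exist before use" constraint the abstract highlights.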
- HADI NADIA, BELALEM GHALEM, DOUDOU NAWEL and BENZOUAK AMINA, Department of Computer Science, University of Oran, Algeria
THE IMPACT OF MULTI-CRITERIA AID DECISION ON DATA
REPLICATION AND TASK SCHEDULING IN GRID COMPUTING (61-73)
Grid computing environments have emerged following the demand of scientists for very high
computing power and storage capacity. They provide a scalable infrastructure for storage resources and
data file management, which supports several large-scale applications. One of the challenges
in the use of these environments is the performance problem. To improve performance,
scheduling and replication techniques are used. In this paper we propose an approach to task
scheduling combined with data replication, with decisions based on a multi-criteria principle, to improve
performance by reducing the response time of tasks and the load of the system. This hybrid approach is
based on a non-hierarchical model that allows scalability.
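A multi-criteria scheduling decision of the kind the abstract names (response time, system load, data placement) is often aggregated as a weighted score per candidate node. The criteria, weights and function names below are hypothetical illustrations of the principle, not the paper's actual model.

```python
def node_score(est_response_time, current_load, has_replica,
               w_time=0.5, w_load=0.3, w_replica=0.2):
    """Illustrative multi-criteria aggregation: prefer nodes with a low
    expected response time, a low load (in [0, 1]), and a local replica
    of the task's input data."""
    return (w_time * (1.0 / (1.0 + est_response_time))   # faster is better
            + w_load * (1.0 - current_load)              # idler is better
            + w_replica * (1.0 if has_replica else 0.0)) # local data is better

def pick_node(nodes):
    """`nodes` maps node name -> (est_response_time, load, has_replica);
    return the node with the highest aggregate score."""
    return max(nodes, key=lambda n: node_score(*nodes[n]))
```

Combining scheduling and replication in one score is what makes the approach "hybrid": the replica criterion pulls tasks toward their data, and new replicas in turn change future scores.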
- SAMIA BOUBAKER, FERID REHIMI and ADEL KALBOUSSI, Laboratoire d’Électronique et de Microélectronique,
Faculté des Sciences de Monastir, Tunisia
MICRO SIMULATION OF ROAD TRAFFIC AND NECESSITY OF VEHICLE
COMMUNICATION TECHNOLOGIES (74-89)
In this work, we have extended a single-lane car-following model (a linear model) to simulate
lane-changing behavior using MOBIL ("Minimizing Overall Braking Induced by Lane Changes"),
which integrates a politeness factor that controls the possibility of exchanging information between
vehicles. Each vehicle can be considered a sender and a receiver of information. We investigate
inter-vehicle communication, which enables vehicles to exchange information generated by
microscopic models. There are two basic strategies of information propagation: instantaneous
information can be passed backwards, against the direction of travel, or forwards, in the
direction of travel. A given vehicle may transfer kinematic variables to a vehicle driving in either
direction. In MOBIL, the transmission of information is governed by the politeness factor. The
principal simulation results concern the transmission of road traffic information at the
microscopic and macroscopic levels. They show that it is necessary to provide communication
technologies which allow instantaneous information transmission between vehicles and to the infrastructure.
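The MOBIL incentive criterion the abstract builds on is commonly stated as: change lanes when the driver's own acceleration gain, plus the politeness-weighted acceleration changes imposed on the affected followers, exceeds a threshold. A minimal sketch of that standard criterion (parameter values here are illustrative defaults, not the paper's calibration):

```python
def mobil_lane_change(acc_self_new, acc_self_old,
                      acc_followers_new, acc_followers_old,
                      politeness=0.5, threshold=0.2):
    """MOBIL incentive criterion: change lanes if
    own_gain + p * (followers' acceleration changes) > threshold,
    where p is the politeness factor weighting others' disadvantage."""
    own_gain = acc_self_new - acc_self_old
    others = sum(new - old for new, old in
                 zip(acc_followers_new, acc_followers_old))
    return own_gain + politeness * others > threshold
```

With politeness p = 0 the driver is purely egoistic; larger p suppresses lane changes that would force followers to brake, which is the "minimizing overall braking" in the model's name.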
- SLIMANI KAHINA, AMEUR ZOHRA and AMEUR SOLTANE, Mouloud Mammeri University, Tizi-Ouzou, Algeria
SEGMENTATION OF BRAIN MRI IMAGES (90-103)
This work investigates the segmentation of brain MRI (Magnetic Resonance Imaging) images using the
multi-scale Canny detector. The objective is to delineate the outline of a tumor in MRI images
of a brain affected by a frontal meningioma. Accurate and robust segmentation of brain tissue provided
by MRI is a very important issue in applications such as surgery and radiotherapy. The multi-scale
Canny edge detector uses a wavelet transform, obtained by orthogonal projection of the
image onto the affine spaces of a wavelet basis at different scales; we thereby obtain an approximation
space and a detail space. On this basis, the modulus maps are computed and used for the extraction
of local maxima. These maxima constitute the multi-scale edge decomposition. The result is a set of
maxima at different scales, chained so that only the most significant contours are kept. This method is
applied to MRI images of a healthy brain as well as images of a brain with a tumor; by comparing the
two results, the tumor has been localized.
- FATIMA ZOHRA BELLOUNAR and BELABBES YAGOUBI, Department of Computer science, University of Oran, Algeria
DYNAMIC REPLICATION STRATEGY IN PEER-TO-PEER HYBRID SYSTEMS (104-115)
Peer-to-peer technologies have recently seen major development in the file-sharing
area and are heavily used by the general public. They are an effective method of sharing
resources between people with the same expectations. These systems enjoy a growing reputation,
mainly due to the many distributed collaboration applications and their specific needs for data
replication, scalability and high availability. In this paper, we propose a model to provide
users of peer-to-peer systems with good availability of shared data. Our solution is to replicate this data
strategically, by defining the data to be replicated, the number of replicas, where they will be placed, and
how to remove excess data in order to free storage space if necessary. Our strategy takes into account
the data popularity, the volatility and the storage capacity of the different sites.
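The three criteria the abstract names (popularity, volatility, storage capacity) can be combined into a simple replica-planning sketch: rank files by a popularity-minus-volatility score and greedily fill a peer's storage. The scoring function and weights are hypothetical illustrations, not the paper's actual model.

```python
def replication_score(popularity, volatility, w_pop=1.0, w_vol=1.0):
    """Illustrative ranking: popular, stable (low-volatility) data is the
    most worth replicating."""
    return w_pop * popularity - w_vol * volatility

def plan_replicas(files, capacity):
    """Greedily choose which files to replicate on a peer with limited
    storage. `files` maps name -> (size, popularity, volatility)."""
    ranked = sorted(files.items(),
                    key=lambda kv: replication_score(kv[1][1], kv[1][2]),
                    reverse=True)
    chosen, used = [], 0
    for name, (size, _pop, _vol) in ranked:
        if used + size <= capacity:   # skip files that would overflow storage
            chosen.append(name)
            used += size
    return chosen
```

Running the same ranking in reverse also gives the eviction order the abstract mentions: when space runs out, the lowest-scoring replicas are the first candidates for removal.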