Software Failure Reduction In Pakistani Software Industry Working On Agile Methods
Agile software development is a group of software development methodologies based on iterative development, in which requirements and solutions evolve through collaboration between self-organizing, cross-functional teams. Agile offers a range of techniques for developing software in short iterations, and it has become more popular than traditional software development methodologies. Generally, small and medium-scale projects use agile approaches, while large projects use traditional methodologies such as the waterfall model. In agile environments the success rate of small projects is high, but it is not satisfactory for extensive projects; conversely, the waterfall model is not well suited to small projects yet can produce precise and explicit results in large projects. Agile methodology has gained phenomenal success around the world, including in Pakistan. However, during development activities several factors adversely affect the Pakistani software industry (PSI) and must be resolved to achieve better development. The failure factors in the Pakistani software industry remain unclear, and there is still a lack of guidelines for the industry. The purpose of this research is to identify the failure factors faced by the Pakistani software industry; identifying and then addressing these factors is important for better development in PSI. In this study, we highlight these software failure factors first through a systematic literature review and secondly by visiting selected Pakistani software companies at distinct locations. The next phase of the research analyses these failure factors and shortlists the most significant ones. The study follows a mixed-method design that combines qualitative and quantitative research. Finally, the study presents comprehensive guidelines, developed in collaboration with software professionals, for the Pakistani software industry. The proposed guidelines produce applicable results that can be utilized by the Pakistani software industry. In future work, this research can be expanded to provide a proper framework that fits the Pakistani environment, and a mechanism can be developed to migrate projects from traditional methods to agile in Pakistan.
Keywords: Agile methodology, Scrum, Survey, Analysis, Qualitative study,
Quantitative study.
A HYBRID RANDOM WALK ASSISTED ZONE-BASED CLONE NODE DETECTION PROTOCOL IN STATIC WIRELESS SENSOR NETWORKS
Wireless sensor networks (WSNs) are typically deployed in harsh and insecure environments, where the sensor nodes are generally unshielded and not tamper-resistant. As a result, WSNs suffer from an attack known as the clone node or node replication attack. The attack is carried out simply by physically capturing a legitimate node in the network and then creating a clone with the same ID as the legitimate node. These clones can be re-programmed for internal attacks such as black-hole and wormhole attacks, DoS attacks, extracting data from the network, injecting false data, and disconnecting legitimate nodes through voting schemes. After taking control, the adversary creates multiple clones in the network for various malicious activities. The clone attack is therefore considered extremely effective, because cloned devices carrying genuine credentials are treated as real devices and can compromise different protocols and sensor applications. A Systematic Literature Review (SLR) was conducted on this subject, through which we identified witness-node-based techniques as the most likely solutions for countering clone attacks. However, these techniques have some notable weaknesses that need to be overcome in order to detect clones more effectively and efficiently.
In view of the disadvantages of existing witness-node-based techniques, this work presents a distributed technique called Hybrid Random Walk assisted Zone-Based (HRWZ) detection of clone nodes in static WSNs. The method is based on the Claimer-Reporter-Witness (CRW) framework. In HRWZ the network is divided into zones, and a random-walk approach called the single-stage-memory random walk is used for the random selection of claimer, reporter, and Zone-Leader nodes, which resolves the usual problems of a simple (pure) random walk. Zone-Leaders are responsible for detecting clone nodes both locally and globally in the network. HRWZ is simulated under different settings to compare clone detection probability and communication, memory, and computation costs against three witness-node-based techniques: RM, LSM, and RAWL. The simulation results confirm the improved performance and reliability of the proposed HRWZ technique. The scheme not only reduces communication and storage costs, but also provides an effective Zone-Leader selection method that yields a high clone detection probability.
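The selection step can be illustrated with a minimal sketch (our own simplification, not the thesis implementation) of a single-stage-memory random walk over a zone graph: the next hop is chosen uniformly among the current node's neighbours, excluding the node visited in the previous step, so the walk never immediately backtracks. The node IDs and the zone graph below are hypothetical.

    import random

    def single_stage_memory_walk(neighbors, start, steps):
        """neighbors: dict mapping node id -> list of neighbour ids (the zone graph)."""
        path = [start]
        previous = None
        current = start
        for _ in range(steps):
            candidates = [n for n in neighbors[current] if n != previous]
            if not candidates:             # dead end: fall back to allowing backtracking
                candidates = neighbors[current]
            nxt = random.choice(candidates)
            previous, current = current, nxt
            path.append(current)
        return path                        # the final node could act as a randomly chosen witness/Zone-Leader

    # Example: a small zone with five nodes
    zone = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5], 4: [2, 5], 5: [3, 4]}
    print(single_stage_memory_walk(zone, start=1, steps=4))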
Keywords: Wireless Sensor Network, Clone Node Detection, Systematic Literature Review,
Challenges
USABILITY SCALE DEVELOPMENT FOR EVALUATING THE QUALITY OF UNIVERSITIES’ WEBSITES IN PAKISTAN
The emergence of universities' websites has revolutionised the academic world, and these websites are becoming a source of benefit for their institutes. Meanwhile, their usage is steadily increasing among users. However, it is observed that universities' websites lack quality standards, due to which usability is being compromised. An analysis of currently proposed quality evaluation scales for universities' websites shows that there are many important usability factors that need to be considered when developing usability scales. Thus, there is a dire need to raise quality standards so that user experience and satisfaction can be attained by addressing these usability issues. A systematic literature review was performed to analyse existing website usability models and website quality evaluation models; the process was based on previous academic research studies and identified the basic usability factors for the quality evaluation of universities' websites. The resulting scale comprises 11 high-level usability factors and 31 sub-factors for universities' websites, grouped into 6 categories. Furthermore, a focus group discussion was conducted to evaluate the developed usability scale, and the results of this evaluation increased the authenticity and accuracy of the scale. The scale not only acts as a guideline but also provides a roadmap for researchers to improve website quality by considering the usability factors necessary to raise the standards of universities' websites. The identified factors help increase users' experience and satisfaction, which further contributes to improving the usability of universities' websites; a high level of usability ultimately raises the quality of universities' websites.
FOG-ORIENTED SECURE AND LIGHTWEIGHT HEALTHCARE DATA AGGREGATION IN INTERNET OF THINGS
Internet of Things (IoT) is becoming an essential research concern because of its broad applicability in the real world. IoT-enabled wireless sensors are deployed to collect patient health information: in a healthcare scenario, sensor devices are placed on the patient's body, and smart healthcare devices securely aggregate the healthcare information and forward it to the base station. Sensor nodes have limited energy, computational, and storage capabilities for communication. Several aggregation techniques are used to reduce the communication cost of healthcare data transmission; however, secure and lightweight transmission of healthcare data remains the main concern. This thesis explores existing secure data aggregation schemes and presents a solution to their challenging issues. It presents an Efficient and Secure Data Transmission and Aggregation (ESDTA) scheme that provides secure and lightweight data transmission. A Secure Message Aggregation (SMA) algorithm is employed to aggregate data at mobile nodes (MNs), where healthcare parameter values are concatenated using a colon as a delimiter, and a Secure Message Decryption (SMD) algorithm is employed at the fog node (FN). The proposed scheme provides lightweight and secure data transmission by applying symmetric-key data encryption and by removing redundant healthcare parameter values from the transmitted data. The proposed ESDTA scheme is implemented in the NS-2.35 simulator and compared with the related schemes EHDA, SPPDA, APPA, and ASAS. Compared with these studies, the proposed scheme provides 23% lower communication cost in terms of bytes exchanged, 19% lower computational cost, 15% lower energy utilization, 57% less storage, and 32% fewer compromised bytes. These results demonstrate the superiority of the proposed work.
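A minimal sketch of the aggregation idea follows, under stated assumptions: readings are joined with a colon delimiter, consecutive duplicate values are dropped as redundant, and the payload is encrypted with a symmetric key before transmission, while the fog-node side decrypts and splits it. The field layout and the use of the Fernet cipher are illustrative choices, not the thesis implementation.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()             # shared symmetric key (MN <-> FN), assumed pre-distributed
    cipher = Fernet(key)

    def sma_aggregate(readings):
        """Mobile-node side: drop consecutive duplicate readings, join with ':', encrypt."""
        deduped = []
        for value in readings:
            if not deduped or deduped[-1] != value:   # remove redundant repeated values
                deduped.append(value)
        payload = ":".join(str(v) for v in deduped)   # colon as delimiter
        return cipher.encrypt(payload.encode())

    def smd_decrypt(packet):
        """Fog-node side: decrypt and split the aggregated message back into values."""
        return cipher.decrypt(packet).decode().split(":")

    # Example: heart-rate samples from a body sensor
    packet = sma_aggregate([72, 72, 73, 73, 75])
    print(smd_decrypt(packet))               # ['72', '73', '75']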
Artificial Bee Colony based Optimization for Data Sharing in Internet of Things
Internet of Things (IoT) comprises a complicated and dynamic aggregation of smart units that normally need decentralized control for data sharing across networks. Artificial bee colony (ABC), one of the most popular swarm-intelligence techniques, is inspired by the collective behaviour of honey bees and can be used to solve clustering problems in large-scale IoT data. The main problem is that each food source is compared with every other food source in its neighbourhood to determine the global best food source; this requires unnecessary comparisons between pairs of poor-quality food sources, which increases utilization time, slows convergence, and increases delay. This work presents an enhanced ABC (E-ABC) based optimization for a data collection and replication mechanism. E-ABC improves the standard ABC algorithm by reducing unnecessary comparisons: it compares only the current best source with the remaining available sources, which excludes comparisons between poor sources. The proposed E-ABC algorithm was applied to replica selection to demonstrate its advantage over counterparts in terms of convergence speed, data availability, and response time. Results show the superiority of the proposed E-ABC over previous algorithms: it provides 65% better response time than DCR2S and 20% better than MOABC when the number of cloudlets is 1000, and its file availability probability is 85% when the total cost is 20. Additionally, some open research challenges are highlighted on the basis of the literature, which will help researchers identify research gaps with respect to IoT and ABC.
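The comparison-reduction step can be sketched as follows (our interpretation, not the thesis code): rather than comparing every pair of food sources, keep the current best and compare each remaining source against it once, so pairs of poor-quality sources are never compared. The replica fitness function used here is an assumption.

    def eabc_best_source(sources, fitness):
        """Single pass: compare each source only against the running best."""
        best = sources[0]
        comparisons = 0
        for candidate in sources[1:]:
            comparisons += 1
            if fitness(candidate) > fitness(best):
                best = candidate
        return best, comparisons             # n-1 comparisons instead of ~n*(n-1)/2 pairwise checks

    # Example: replicas scored by availability / response time (an assumed fitness)
    replicas = [{"id": 1, "avail": 0.90, "rt": 120},
                {"id": 2, "avail": 0.80, "rt": 60},
                {"id": 3, "avail": 0.95, "rt": 200}]
    best, n = eabc_best_source(replicas, lambda r: r["avail"] / r["rt"])
    print(best["id"], n)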
PEMD: PRIORITISED EMERGENCY MESSAGE DISSEMINATION IN VEHICULAR SOFTWARE DEFINED NETWORKS
Vehicular Software Defined Networking (VSDN) has attained considerable attention because of its applications in traffic engineering, network intelligence, and security services. However, due to high mobility, dense traffic, and the limited communication time between vehicles, designing a reliable emergency dissemination strategy in VANETs that minimizes transmission delay to meet the needs of delay-sensitive applications is critical and challenging. Moreover, existing methods lack a reliable software-defined mechanism for scheduling and disseminating emergency messages according to their severity levels. In this work, we propose a novel four-class priority emergency packet scheduling method for VSDN, named Prioritised Emergency Message Dissemination (PEMD), to provide real-time data services in vehicular networks based on cooperative decisions. It consists of four priority classes: medium, high, very high, and extremely high. When a packet arrives at an RSU, the policy-based multifold classifier (PMF) classifies it as an emergency or a normal packet. Medium-, high-, very-high-, and extremely-high-priority packets are treated as real-time data, while normal packets are treated as non-real-time data. The performance of the proposed method is analyzed in NS-3.29 simulations, which show improvements in service delay, service ratio, deadline miss ratio, packet transmission, and network scalability compared with state-of-the-art methods such as FCFS, EDF, ADPS, and SDN-controlled VNDN.
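The scheduling idea can be sketched as follows, with invented severity thresholds standing in for the PMF classifier's policy: each arriving packet is assigned one of the four emergency classes or marked normal, and a priority queue serves the most severe class first. This is an illustrative sketch, not the thesis algorithm.

    import heapq

    PRIORITY = {"extremely_high": 0, "very_high": 1, "high": 2, "medium": 3, "normal": 4}

    def classify(packet):
        """Stand-in for the policy-based multifold (PMF) classifier."""
        if not packet.get("emergency"):
            return "normal"
        severity = packet["severity"]          # assumed field, e.g. 0..10
        if severity >= 9:
            return "extremely_high"
        if severity >= 7:
            return "very_high"
        if severity >= 5:
            return "high"
        return "medium"

    def schedule(packets):
        queue = []
        for seq, p in enumerate(packets):
            heapq.heappush(queue, (PRIORITY[classify(p)], seq, p))
        while queue:
            _, _, p = heapq.heappop(queue)
            yield p                            # dissemination order: most severe first

    arrivals = [{"emergency": False, "id": "beacon"},
                {"emergency": True, "severity": 9, "id": "collision"},
                {"emergency": True, "severity": 5, "id": "congestion"}]
    print([p["id"] for p in schedule(arrivals)])   # collision, congestion, beacon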
Incentive Mechanisms for Cooperative tasks in mobile crowdsensing
Nowadays, the Mobile Crowdsensing System (MCS) serves as a building block for emerging Internet of Things (IoT) applications. A lot of work has been done on the design of incentive mechanisms; however, existing mechanisms neglect the performance of workers during the recruitment of new workers or the termination of existing ones, which reduces workers' interest. The main problem is that the reward paid to participants for performing a sensing task is fixed, whether the task is easy or extensive. Participants are therefore inclined to earn the same amount on easy tasks and avoid extensive tasks, which results in less competition due to fewer participants for extensive tasks. In this work, we design an incentive mechanism based on a Reverse Auction with Dynamic Price, in which the total cost of performing a task is not fixed and participants are rewarded according to the difficulty level of the task. The system also selects potential candidates on the basis of their contribution to the system. First, participants submit bids; then the platform selects the lowest bidder. The design also rewards existing workers in the form of a lottery or bonus, and gives an inner lottery to terminating workers on the basis of their contributions to the system. We present a detailed design of the Dynamic Price incentive mechanism, which is based on a reverse auction process. The performance of the proposed system model is examined by developing a testbed for evaluating and analyzing the datasets, and a simulation is performed to collect data from sensing devices with respect to metrics such as the number of workers, rewards, level of tasks, and rank of users. Results are compared with the existing RAB and RADP schemes in terms of worker participation in each cell, rewards, and the total cost of the task.
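A minimal sketch of the selection and reward steps follows, under assumed scaling rules that are not the thesis formulas: the platform picks the lowest bid (reverse auction) and scales the payment by the task's difficulty level, with a small bonus for the worker's past contribution.

    def select_winner(bids):
        """bids: list of (worker_id, bid_amount); reverse auction -> lowest bid wins."""
        return min(bids, key=lambda b: b[1])

    def dynamic_reward(base_bid, difficulty, contribution=0.0):
        """Reward grows with task difficulty (1 = easy .. 5 = extensive) plus a small
        bonus proportional to the worker's past contribution score (illustrative rule)."""
        return base_bid * (1 + 0.25 * (difficulty - 1)) + 0.1 * contribution

    bids = [("w1", 8.0), ("w2", 6.5), ("w3", 7.2)]
    winner, bid = select_winner(bids)
    print(winner, dynamic_reward(bid, difficulty=4, contribution=12))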
An empirical study on the perceived effectiveness of sustainable design of mobile app icons
Sustainability has emerged as an important theme in society and is also becoming a source of benefit for HCI. However, sustainability principles and the practice of sustainability in many areas, such as sustainable graphic design, are still at an early stage. An analysis of the currently proposed characteristics for app icons shows that there are many important sustainable design characteristics that need to be considered when designing Play Store app icons. Thus, there is a dire need to raise these issues regarding sustainable interaction design so that user experience and satisfaction can be attained by addressing them. A narrative literature review (NLR) was performed to analyse existing characteristics that relate to sustainable design and their effect on app icons; the process was based on previous academic research studies and identified six characteristics related to the sustainable design of mobile app icons. Furthermore, a controlled experiment was conducted to evaluate the perceived effectiveness of the identified characteristics among users, and its results increased the authenticity and effectiveness of the identified characteristics. The identified characteristics will be helpful for designing sustainable and effective app icons, and they will help increase user experience and satisfaction, which further contributes to improving the effectiveness of app icons. Thus, the sustainable design of mobile app icons raises their perceived effectiveness among users.
Factors affecting crowd participation time in crowdsourcing contests
A crowd is generally considered a group of people who gather for the same cause but are not otherwise related or interconnected. A software crowdsourcing contest can be regarded as one of the newer and highly innovative modes of crowdsourcing, and it is also one of the most widely accepted setups for an organization to announce and implement an open online call for a desired task. Crowd participation time is one of the main factors that plays an essential role in task completion. Research on solver participation in crowdsourcing contests has proved helpful in understanding and managing the motivations of solvers to take part in online software crowdsourcing platforms. The main focus of this research is therefore the strategy for attracting more participants to a contest and motivating them to invest more effort and time. Previous studies have measured the submission rate to estimate the rate of solver participation in software contests, but they lack feasible suggestions about the average participation effort expended by a participant on software development tasks such as bug fixing and interface evaluation. This research was conducted through an SLR followed by an expert review approach: the SLR extracted the factors that affect participation time from the existing literature, and the identified factors then went through an expert review for validation. The resulting list of factors will be helpful for industry practitioners and for academics updating their research in the field of crowdsourcing. Monetary rewards, communication and coordination, task understandability, and task documentation were found to have a considerable impact on participation time in software crowdsourcing contests. The research opens the gate for future work seeking to formulate how the crowd participation of web developers and mobile app developers is affected.
A CONCEPTUAL MODEL FOR EARLY DETECTION OF FAKE NEWS
In the era of technology and digital media, the rapid interaction and massive spread of information have increased the need for credible information. The concept of fake or forged news is not new in that regard, nor is its profound impact on the addressed audience. This malicious act causes discomfort, character assassination, privacy breaches, and defamation of the targeted audience, and such news is endorsed to disrupt society's normal functioning. Because of its persuasive terminology and framing, fake news tends to destroy openness to truth-seeking: it interrupts the normal thinking process of the targeted audience, who end up with a tuned mindset that ignites violence in society. The reviewed research shows that automated detection of fake news has always been the prime focus, although its authenticity, according to the research community, is still questionable. It is important to understand that automation, without unfolding the core constructs on the basis of which news is labelled as fake, can never be relied upon, because the pattern of news creation and dispersion changes with time and with new technology. Manual detection has likewise added value to existing research on fake news detection, but it is considered a costly and tiresome task. It is also notable that present research ignores the question of what makes news fake.
The need of the hour is to shift the focus to the constructs that contribute to labelling and detecting fake news at an early stage, based on the previous and recent state of knowledge, and to develop a conceptual model that standardizes the detection process around verified core constructs. The objective of this research is to identify these constructs and to classify and categorize news for the detection of fake news. This research therefore contributes a conceptual model encompassing the core constructs that contribute to the early detection of fake news from the point at which it originates and disperses. On that account, a systematic literature review is conducted to extract constructs from the existing literature, applying implicit and explicit exclusion criteria. Subsequently, the data coding technique of grounded theory is applied to encode the extracted data. Lastly, expert reviews are conducted to validate the proposed conceptual model encompassing the core constructs that contribute to the propagation and dispersion of fake news. As a result, a total of 74 constructs are identified, which are further grouped into 15 categories. This research will eventually help data scientists label news as fake or real based on the recognized, verified constructs.
Pipeline hazard monitoring and reporting in linear wireless sensor networks
A Linear Wireless Sensor Network (LWSN) is a sensor deployment technique in which sensors are placed along an in-line, linear structure known as a linear infrastructure. Pipeline hazards such as damage or leakage are monitored using an LWSN and reported to the control center or base station, after which suitable actions are taken to mitigate the reported issue and to protect the underwater environment, habitat, and economy. The main issue with an Autonomous Underwater Vehicle (AUV) is that it traverses to every sensing node to collect data, which causes energy loss. This work presents variable-speed and fixed-speed schemes for collecting data from sensing devices in a deployment that includes relay nodes as well as data distribution nodes: the AUV gathers data only from the responsible data distribution nodes, which receive data from the low-power devices, thereby reducing the communication cost of the data collection process in both the variable-speed and fixed-speed AUV scenarios. Simulations are performed in NS-2 for validation. The proposed scheme provides encouraging results in terms of delay, communication cost, energy consumption, buffer size, and delivery ratio when compared with different algorithms.
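As a rough illustration (our own simplified model, not the thesis scheme), the benefit of collecting only from data distribution nodes can be seen by counting per-stop collection time: the AUV travels the same pipeline length but pauses at far fewer nodes, so the total mission time and energy drop. All numbers below are hypothetical.

    def mission_time(pipeline_length, speed, num_stops, dwell):
        """Travel time plus per-stop data-collection (dwell) time, in seconds."""
        return pipeline_length / speed + num_stops * dwell

    # Visiting all 30 sensing nodes vs. only 5 distribution nodes on a 3 km pipeline
    print(mission_time(3000, speed=2.0, num_stops=30, dwell=20))   # 2100 s
    print(mission_time(3000, speed=2.0, num_stops=5,  dwell=20))   # 1600 s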
BOUNDARY DETECTION USING CONTINUOUS OBJECT TRACKING IN IOT ENABLED WIRELESS SENSOR NETWORKS
The Internet of Things (IoT) has attained considerable attention in today's era because of its enormous range of applications in fields such as environmental perception, military monitoring, predictive maintenance, and industrial applications. The IoT approach provides numerous considerable advantages to various application domains and has acquired radiant consideration throughout recent years on account of emerging applications that allow tracking and monitoring. The most prominent applications offer localization and detection of continuous objects, for example wildfire, toxic gas, mudflow, and oil spills. Continuous objects are detected to investigate the boundary of the hazardous area and to alert staff for safety. Existing studies lack accurate, energy-efficient, and delay-minimized boundary detection mechanisms for continuous objects. In emergency situations, detecting the accurate boundary of continuous objects has become a noteworthy challenge in which reducing delay and minimizing energy consumption are treated as first-class concerns. This work proposes a novel mechanism for detecting the accurate boundary of continuous objects in a fog-oriented environment using IoT-enabled devices, tackling delay-related issues while maximizing energy efficiency. To avoid the high latency of communicating with the cloud, a grid-based scheme is applied to detect the boundary region of continuous objects; to reduce energy and latency, the technique requests decisions only from each grid's cluster head, while the fog node estimates the diffusing region of the object. The proposed work is implemented through simulation in NS-2. Experimental results show better boundary detection while reducing transmission delay and energy consumption compared with state-of-the-art strategies.
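A minimal sketch of the grid-based idea, under our own simplifying assumptions: each cell's cluster head reports whether the continuous object is sensed in its cell, and a cell is marked as a boundary cell if it detects the object while at least one neighbouring cell does not; the fog node can then estimate the diffusing region from these cells. This is an illustration, not the thesis algorithm.

    def boundary_cells(grid):
        """grid: 2-D list of booleans (True = object detected in that cell)."""
        rows, cols = len(grid), len(grid[0])
        boundary = []
        for r in range(rows):
            for c in range(cols):
                if not grid[r][c]:
                    continue
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                        boundary.append((r, c))   # the fog node can fit a contour to these cells
                        break
        return boundary

    # Example: toy 4x4 grid of cluster-head reports (1 = gas detected)
    detections = [[0, 0, 0, 0],
                  [0, 1, 1, 0],
                  [0, 1, 1, 1],
                  [0, 0, 1, 0]]
    print(boundary_cells([[bool(v) for v in row] for row in detections]))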
Cloud detection in remote sensing images using deep learning
Remote sensing images play a vital role in the analysis of the earth's surface. This analysis is productive only when the sky is clear, because clouds obscure the earth's surface and create problems for remote sensing applications such as change detection, agriculture, surveillance, and urban and rural planning. Various methods have been proposed for cloud detection, ranging from pixel intensity transformation based methods to deep learning methods. The intensity transformation methods are generally fast, but they are susceptible to variation in pixel intensities, illumination changes, and noise. The deep learning methods, on the other hand, are efficient and accurate but require training on one or more datasets prior to cloud detection. In this thesis, the You Only Look Once (Yolo) algorithm is investigated for cloud detection. Yolo has been successfully applied to the detection and recognition of real-life objects in indoor and outdoor images. Here, Yolo is combined with three other state-of-the-art deep learning algorithms to improve its accuracy for cloud detection: Practical Portrait Human Segmentation Lite (PP-HumanSeg_Lite), Deep Dual Resolution Network (DDRNet), and Disentangled Non-Local Network (DNLNet), all of which have been used for the semantic segmentation of real-life objects. The combination of Yolo with PP-HumanSeg_Lite, DDRNet, and DNLNet is done through an ensemble learning method in which the responses of Yolo and the other algorithms are combined and provided to a Random Forest for the final cloud detection decision. Experiments are performed on two different cloud datasets, the High Resolution Cloud Detection (HRCD) dataset and the 38-Cloud segmentation dataset. The experimental results show that Yolo+PP-HumanSeg_Lite gives the best results, achieving accuracies of 96% and 93% on the HRCD and 38-Cloud datasets, respectively, whereas Yolo alone achieves 91.2% and 81.5%.
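The ensemble step can be sketched as follows, with dummy arrays standing in for the real detector outputs: per-pixel responses from Yolo and a segmentation network are stacked as features and passed to a Random Forest that makes the final cloud/non-cloud decision. This is an illustrative sketch, not the thesis pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Dummy per-pixel scores for a 64x64 tile (in practice: outputs of the trained detectors)
    rng = np.random.default_rng(0)
    yolo_scores = rng.random((64, 64))
    seg_scores  = rng.random((64, 64))
    labels      = (0.5 * yolo_scores + 0.5 * seg_scores > 0.5).astype(int)  # toy ground truth

    X = np.stack([yolo_scores.ravel(), seg_scores.ravel()], axis=1)  # one feature row per pixel
    y = labels.ravel()

    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(X, y)                       # in practice: fit on training tiles, predict on test tiles
    cloud_mask = rf.predict(X).reshape(64, 64)
    print(cloud_mask.sum(), "pixels predicted as cloud")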
INTRUSION DETECTION USING DEEP LEARNING IN IOT-BASED SMART HEALTHCARE
The rapid increase and implementation of Internet of Things (IoT) based technologies in healthcare have made a significant contribution to the global network. Despite bringing useful benefits all over the globe, such as real-time monitoring of patients' information and timely diagnosis whenever needed, IoT-based systems appear to be an easy target for intruders. As the number of threats and attacks against IoT devices and services rapidly increases, securing the IoT in healthcare has become more challenging. To meet this challenge, an effective hybrid-learning-based intrusion detection system for the IoT needs to be developed. In this study, we propose a novel hybrid model for intrusion detection in IoT-based smart healthcare using Random Forest (RF), SVM, LSTM, and gradient boosting. We propose a generalized model that handles the problems of overfitting and underfitting, and we generate a new feature to make the proposed model more effective at detecting intrusions in the IoT. We study the performance of the proposed model on multi-class classification using the MQTT-IoT-IDS2020 dataset, a recent dataset of IoT network traces, and compare it with different ML and DL algorithms. Experimental results show that our model performs better intrusion detection than the other DL and ML algorithms.
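A minimal sketch of such a hybrid ensemble is shown below, using scikit-learn's stacking of Random Forest, SVM, and gradient boosting with a logistic-regression meta-learner; the LSTM branch is omitted for brevity, and synthetic data stands in for MQTT-IoT-IDS2020 features. This is a sketch under those assumptions, not the thesis model.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Synthetic multi-class traffic features as a placeholder for the real dataset
    X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                               n_informative=8, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    hybrid = StackingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                    ("svm", SVC(probability=True, random_state=0)),
                    ("gb", GradientBoostingClassifier(random_state=0))],
        final_estimator=LogisticRegression(max_iter=1000))
    hybrid.fit(X_train, y_train)
    print("multi-class accuracy:", hybrid.score(X_test, y_test))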