Intelligent Assignment Engine by Genetic Algorithm

CS 493 Graduation Project

Intelligent Assignment Engine by Genetic Algorithm

Supervised by Dr. Abdullah Alsheddy

Done by Ebtehal Turki Sahow Alotaibi, Entesar Alonizi, Fatmah Mohammed Jaddoh, Montaha Ali Aaba-alkhail, Shahad Saleh Alqefari

Al-Imam Muhammad Bin Saud Islamic University
Faculty of Computer Sciences
Department of Computer Science and Information Systems
May 8, 2012, Riyadh
©2012 All rights reserved


Abstract

Task distribution has always been difficult in many areas; it requires a complex system that operates under multiple constraints, including employee knowledge, preferences, and others. It is not easy to generate a satisfactory solution that meets the needs of all related factors, and it is therefore extremely difficult to obtain a strategy that covers all critical issues in distributing tasks. The task assignment problem is discussed here under multiple constraints and can be solved using metaheuristic techniques, which are powerful algorithmic approaches that have been applied with great success to many difficult combinatorial optimization problems by finding optimal or near-optimal solutions in a reasonable time. Parallel metaheuristics are more powerful than their sequential versions and aim to solve larger problem instances in reasonable computing times. In appropriate settings, such as co-operative multithreading, parallel metaheuristics also prove to be much more robust than sequential versions in dealing with differences in problem types and characteristics, and they require less extensive, and expensive, parameter calibration effort. In particular, we use a genetic algorithm to deal with the issue of multiple constraints; it is a global optimization search method that can save significant time spent on task assignment and whose results are more acceptable to employees.


Acknowledgements

Special thanks go to our supervisor, Dr. Abdullah Al-sheddy, whose recommendations and suggestions have been invaluable for the project. Words alone cannot express the thanks we owe to Dr. Manal Taoufiki and T. Wojdan Al-Saeedan for their encouragement and assistance.


Contents

Abstract
Acknowledgements
1 Introduction
  1.1 Introduction
  1.2 Motivation
  1.3 Problem Statement
  1.4 Aim and Objectives
  1.5 Research Scope
  1.6 Project Plan
  1.7 Report Organization
2 Literature Review
  2.1 Introduction
  2.2 Background
    2.2.1 Combinatorial Problems
    2.2.2 Metaheuristics
      2.2.2.1 Exact and Metaheuristic Methods
      2.2.2.2 Trajectory and Population Metaheuristics
    2.2.3 Genetic Algorithms
  2.3 Related Work
  2.4 Conclusion
3 Problem Formulation
  3.1 Problem Description
    3.1.1 Overview
    3.1.2 Assumptions
    3.1.3 Given Data
  3.2 Mathematical Formulation
    3.2.1 Normalization
  3.3 Problem Structure
  3.4 Summary
4 Applying GA on the Assignment Problem
  4.1 Algorithms Design
  4.2 Experimental
    4.2.1 Comparison of IAE Algorithm and Literature Algorithm
    4.2.2 Comparison of IAE Problem Model and Literature Problem Model
    4.2.3 Genetic Algorithm Operator Test
    4.2.4 Genetic Algorithm Parameter Setting
5 Cooperative Parallel Genetic Algorithm on the Assignment Problem
  5.1 Independent Parallel Genetic Algorithm
    5.1.1 IndePGA Flowchart
    5.1.2 Comparison of IndePGA and GA
  5.2 Cooperative Parallel Genetic Algorithm
    5.2.1 Asynchronous Parallel Distributed Genetic Algorithm with Elite Migration (ACoPGA)
      5.2.1.1 Situations for Communication between the Elite Server and Subpopulation Clients
      5.2.1.2 Elite Server Flowchart
      5.2.1.3 Subpopulation Client Flowchart
      5.2.1.4 Comparison between IndePGA and ACoPGA
    5.2.2 Synchronous Migration (SCoPGA)
      5.2.2.1 Synchronous Migration Flowchart
      5.2.2.2 Comparison between IndePGA and SCoPGA
    5.2.3 Merged Cooperative Parallel Genetic Algorithm (MCoPGA)
  5.3 Experimental Results
    5.3.1 Summary of IndePGA Parameter Settings
    5.3.2 Summary of ACoPGA Parameter Settings
    5.3.3 Summary of SCoPGA Parameter Settings
6 System Analysis and Design
  6.1 Introduction
  6.2 System Analysis
    6.2.1 Overall Description
      6.2.1.1 Product Perspective
      6.2.1.2 Product Features
      6.2.1.3 User Characteristics
      6.2.1.4 Operating Environment
    6.2.2 Specific Requirements
      6.2.2.1 Functional Requirements
  6.3 System Design
    6.3.1 Class Diagram
    6.3.2 Intelligent Assignment Flowchart
    6.3.3 Data Design
      6.3.3.1 ER Diagram
      6.3.3.2 Database Schema
      6.3.3.3 Description of Attributes
    6.3.4 System Interfaces
  6.4 System Testing
    6.4.1 Unit Testing
    6.4.2 User Input Testing
    6.4.3 Integration and Regression Testing
    6.4.4 User Acceptance Testing
7 Conclusion and Future Works
A
  A.1 Data Set
B
  B.1 Gantt Chart
C
  C.1 Genetic Algorithms Design
  C.2 Genetic Algorithm Operator Test
  C.3 Genetic Algorithm Parameter Setting
  C.4 Parallel Genetic Algorithm Design
    C.4.1 Elite Server Algorithm
  C.5 PGA Parameter Setting
    C.5.0.1 Synchronous Migration Algorithm
D
  D.1 Intelligent Assessment System Test
    D.1.1 Engine Subsystem
References

List of Figures

2.1 SA sequence solution
4.1 IAE algorithm and YZW algorithm representation
4.2 IAE model with YZW model error bar (case 1)
4.3 IAE model with YZW model error bar (case 2)
5.1 PGA and GA comparison error bar
5.2 IndePGA and GA comparison results error bar
5.3 Asynchronous elite migration
5.4 ACoPGA and PDGA fitness value
5.5 Synchronous migration
5.6 Fitness values representation for SCoPGA and IndePGA
5.7 Fitness values representation for all approaches
6.1 Administrator interface
6.2 Teacher interface
C.1 Selection strategies results error bar
C.2 Crossover split points error bar
C.3 Mutation mechanisms error bar
C.4 Replacement mechanisms consumed time
C.5 Replacement mechanisms error bar
C.6 Generations-time representation
C.7 Generation numbers and error bar
C.8 Chromosomes-time representation
C.9 Chromosome numbers and error bar
C.10 Tournament sizes and error bar
C.11 Crossover rates and error bar
C.12 Mutation rate and error bar
C.13 Generation numbers and error bar
C.14 Chromosome numbers and error bar
C.15 Crossover rates and error bar
C.16 Mutation rates and error bar
C.17 Tournament sizes and error bar
C.18 Thread sizes and error bar
C.19 Interval numbers and fitness values - average representation
C.20 Initial longevity and fitness values - average representation
C.21 Number of migrants and fitness values - average representation
C.22 Migration interval and fitness values representation
C.23 Migration rate and fitness values representation

List of Tables

1.1 Project plan
4.1 Comparing IAE algorithm and Yen-Zen Wang algorithm
4.2 Comparing IAE model with YZW model (case 1)
4.3 Comparing IAE model with YZW model (case 2)
5.1 PGA and GA comparison results
5.2 PGA and GA comparison results
5.3 ACoPGA and PDGA fitness value
5.4 Fitness values representation for SCoPGA and IndePGA
5.5 Fitness values representation for all approaches
A.2 Courses offered records
A.3 Minimum teaching requirement and teaching willingness of each teacher
C.1 Selection strategies results
C.2 Crossover split points results
C.3 Mutation mechanisms results
C.4 Replacement mechanisms results
C.5 Generation numbers and fitness values
C.6 Chromosome numbers and fitness values
C.7 Tournament sizes and fitness values
C.8 Crossover rates and fitness values
C.9 Mutation rates and fitness values
C.10 Generation numbers and fitness values
C.11 Chromosome numbers and fitness values
C.12 Crossover rates and fitness values
C.13 Mutation rates and fitness values
C.14 Tournament size and fitness values
C.15 Thread sizes and fitness values
C.16 Interval numbers and fitness values
C.17 Initial longevity and fitness values
C.18 Number of migrants and fitness values
C.19 Migration interval and fitness values
C.20 Migration rate and fitness values


Chapter 1

Introduction

1.1 Introduction

Combinatorial problems involve finding an optimal grouping, ordering, or assignment of a discrete, finite set of objects that satisfies given conditions. The task-employee assignment problem is a combinatorial problem in terms of finding an optimal or suboptimal assignment: the system has to assign tasks to employees according to the constraints in an optimal way. Since optimization is crucial, metaheuristic optimization techniques are frequently used for such problems. A metaheuristic is formally defined as an iterative generation process that guides a subordinate heuristic by intelligently combining different concepts for exploring and exploiting the search space; learning strategies are used to structure information in order to find near-optimal solutions efficiently. One metaheuristic method is the Genetic Algorithm (GA), which we use in this project. It is a population metaheuristic in which a multiplicity of solutions evolve concurrently. A GA tries to imitate the development of new and better populations among different species during evolution, just as their early biological predecessors did. Unlike most standard heuristic algorithms, GAs use information from a population of individuals (solutions) when they conduct their search for better solutions, as opposed to information from only a single individual. Applying a GA sequentially to solve this kind of problem, especially when the search space is huge, is unproductive. Applying a GA in parallel is more powerful than the sequential version for such a problem; although parallelism significantly reduces the temporal complexity of the search process, this complexity remains high for task-employee assignment problem domains. Therefore, parallelism comes as a natural way not only to reduce the search time, but also to improve the quality of the provided solutions.

1.2 Motivation

The Intelligent Assignment Engine motivated us in many ways. Many organizations operate by setting goals and then setting actions to achieve these goals, and helping an organization make such decisions in an optimal or suboptimal way is not an easy task. It is therefore an exciting area that constantly poses new challenges.


On the other hand, the Intelligent Assignment Engine can facilitate the assignment process for a large organization with many workers in terms of robustness, speed, and quality. Many organizations also tend to computerize the process of assigning tasks, so such a project can be a great opportunity for us on the commercial side.

1.3 Problem Statement

One type of real-life combinatorial problem is the task-employee assignment problem. It is a complex problem in terms of making the best, conflict-free decision. The Intelligent Assignment Engine is a general system that can be used in any organization; it helps the administrator assign tasks to employees and maintain the constraints in a fair and optimal or suboptimal way. The engine contains two sub-problems: a constraint satisfaction problem (the assignment problem) and optimization of the final CSP solution. Each employee has preferences, a specialization, skills, and other personal information; the system assigns each task to a specific expert based on the defined constraints and then evaluates this assignment to move it towards the optimum. Such systems need a high degree of performance in terms of computation time, robustness, and quality of solution. Using the traditional independent parallel version of a metaheuristic algorithm will find an optimal task-employee assignment and improve the calculation time of the solution. We will implement a cooperative (dependent) version of a parallel metaheuristic algorithm to enhance the overall performance (quality of solution as well as computation speed); in particular, we use the genetic algorithm as a population heuristic. As a case study for the task-employee assignment problem, we apply it to the teacher-course assignment problem. The problem addressed is the assignment of teachers to courses, each of which may have many sections, at the university level. The assignment should consider faculty preferences, experience, skills, and other constraints that influence the assignment problem. Due to the varied nature and complexity of the problem, it is difficult to find a general procedure to solve it. The objective is to optimize the model so as to maximize faculty-course preferences when assigning faculty members, while taking other factors into consideration.

1.4 Aim and Objectives

The Intelligent Assignment System aims to help create a highly productive organization. Employees are the most valuable asset of any organization, and keeping them engaged and motivated is the key to a successful organization. Taking the willingness of the employees into consideration, along with the manager's desire to have employees work on the tasks they are qualified for and to distribute the workload evenly, are the most important aims of our engine.

Objectives:

1. Formulate the problem to achieve a good understanding of it.

2. Use a GA, since optimization is a crucial factor in this problem, in order to efficiently find near-optimal or optimal solutions.
   (a) Choose optimal parameters and settings for all GA operators until they achieve adequate performance for the given problem.
   (b) Design and validate a scale to measure our work against others.


3. Apply a parallel GA to larger instances to improve the solution cost in reasonable computing times, and to provide more robust solutions than the sequential version in dealing with differences in problem instances.
   (a) Set optimal parameter settings for the parallel GA operators until they achieve adequate performance for the given problem.
   (b) Compare the applied methods: the parallel genetic algorithm and the genetic algorithm.

4. Implement a cooperative (dependent) version of the parallel GA to enhance the overall performance and quality of solution as well as computation speed.
   (a) Set optimal parameter settings for the cooperative parallel GA operators until they achieve adequate performance for the given problem.
   (b) Compare the performance of the different GA versions.

5. Design a system and apply it to a real-life problem, so that it serves both manager and employee by assigning each task to a qualified person, satisfying employee willingness, and considering fairness in the number of tasks assigned to each employee and even distribution of the tasks.

1.5 Research Scope

The scope of this project is confined to a task assignment problem, and to parallelism and optimization using a GA. Optimization and the task assignment problem fall under the artificial intelligence field, while the parallelism and cooperative algorithm work falls under the operating systems field. As with all studies, there is the problem of finding good and relevant information.

1.6 Project Plan

General:
- Gathering resources (time tabling / GA / TSP / parallelism / processor allocating / search space dividing / threading / data exchanging / local and global optima / synchronization and asynchronization / control of the particles / independent and cooperative search) — Problem specification

Case 1:
- Search and reading about GA and TSP; gathering task-employee assignment (teacher-course assignment) problem data, information, and requirements; formulate the problem to meet algorithm requirements (define the search space, populations, individuals, individual representation, reproduction method, fitness function, and termination condition of the teacher-course assignment problem) — Problem specification
- Design problem requirements (DB, GUI, ...) — Design
- Rewrite the algorithm with the defined problem instances; code the algorithm (in a programming language) — Implementation
- Run the code and calculate the execution time — Test
- Maintain the errors (if any) — Maintenance
- Documenting case 1 results — Documentation

Case 2:
- Search and reading about independent search, processor allocating, and threads — Problem specification
- Design two versions of the GA algorithm with problem instances to work independently — Design
- Code the algorithms, without data sharing — Implementation
- Run the two versions on one processor and calculate the time — Test
- Maintain the errors (if any) — Maintenance
- Documenting case 2 results — Documentation

Case 3:
- Search and reading about parallel implementation, search space dividing, threading, data exchanging, local and global optima, synchronization and asynchronization, control of the particles, and cooperative search — Problem specification
- Design the parallel method layout; divide the search space; design the algorithm with the problem instances to work dependently, with data sharing — Design
- Code the two versions and allocate each version to a specific processor — Implementation
- Run the two versions on the processors and calculate the execution time — Test
- Maintain the errors (if any) — Maintenance
- Documenting case 3 results — Documentation

Table 1.1: Project Plan

1.7 Report Organization

The remainder of the documentation is organized as follows. Chapter 2 presents the project's literature review and background. Chapter 3 formulates the teacher assignment problem mathematically. Chapter 4 discusses the GA solution. Chapter 5 improves the GA solution through parallelism. Chapter 6 discusses the analysis and design of the GA application. Finally, Chapter 7 provides concluding remarks.


Chapter 2

Literature Review

2.1 Introduction

Most optimization tasks found in real-world applications impose several constraints that usually do not allow the use of exact methods. The complexity of these problems (they are often NP-hard) and the limited resources available to solve them (time, memory) have made the development of metaheuristics a major field in operations research. In these cases, metaheuristics provide optimal or suboptimal feasible solutions in a reasonable time. Although the use of metaheuristics significantly reduces the time of the search process, the high dimensionality of many tasks will always pose problems and result in time-consuming scenarios for industrial problems.

2.2 Background

2.2.1 Combinatorial Problems

Combinatorial problems arise in many areas of computer science and its applications, such as scheduling, planning, design, and the natural sciences; they can be broadly divided into two classes, satisfaction problems and optimization problems. In a combinatorial problem, variables have to be assigned values so that some constraints on these variables are satisfied and, optionally, so that some cost expression over these variables takes a maximal or minimal value. A solution is such an assignment. Studying conceptually simple problems facilitates the development, analysis, and presentation of algorithms. Many solver programs exist for combinatorial problems, performing either complete search or local search.

2.2.2 Metaheuristics

Metaheuristics are powerful classes of optimization techniques that have gained a lot of popularity in recent years. These techniques can provide useful and practical solutions for a wide range of problems and application domains. They basically try to combine basic heuristic methods in higher-level frameworks aimed at efficiently and effectively exploring a search space. The power of metaheuristics lies in their capability to deal with complex problems with little or no knowledge of the search space, and thus they are particularly well suited to a wide range of computationally intractable optimization and decision-making applications. Typically, metaheuristics are high-level strategies that guide an underlying, more problem-specific heuristic to increase its performance. The class of metaheuristic algorithms includes, but is not restricted to, ant colony optimization (ACO), evolutionary computation (EC) including genetic algorithms (GAs), iterated local search (ILS), simulated annealing (SA), and tabu search (TS).

2.2.2.1 Exact and Metaheuristic Methods

The available techniques for solving combinatorial optimization problems (COPs) can roughly be classified into two main categories: exact methods and metaheuristics. Exact algorithms are guaranteed to find an optimal solution and to prove its optimality for every instance of a COP, but they are impractical for solving large problems because they are extremely time-consuming. "The run-time, however, often increases dramatically with the instance size, and often only small or moderately-sized instances can be practically solved to provable optimality. In this case, the only possibility for larger instances is to trade optimality for run-time, yielding metaheuristic algorithms. In other words, the guarantee of finding optimal solutions is sacrificed for the sake of getting good solutions in a limited time" [58]. The use of metaheuristics generally meets the needs of decision makers to efficiently generate satisfactory solutions.

2.2.2.2 Trajectory and Population Metaheuristics

There are different ways to classify and describe metaheuristic algorithms. Depending on the characteristics selected to differentiate among them, several classifications are possible. The most important characteristic that can be used for the classification of metaheuristics is the number of solutions used at the same time: does the algorithm work on a population or on a single solution at any time? Algorithms working on single solutions are called trajectory methods, because the search process performed by these methods is characterized by a trajectory in the search space. "Trajectory metaheuristics like Tabu Search, Iterated Local Search and Variable Neighborhood Search all share the property of describing a trajectory in the search space during the search process" [51]. Most of these methods are extensions of simple iterative improvement procedures, whose performance on their own is usually quite unsatisfactory; they incorporate techniques that enable the algorithm to escape from local minima. This implies the necessity of termination criteria other than simply reaching a local minimum. Commonly used termination criteria are a maximum CPU time, a maximum number of iterations, a solution of sufficient quality, or reaching a maximum number of iterations without improvement. Population-based methods deal in every iteration of the algorithm with a set of solutions rather than with a single solution. As they deal with a population of solutions, population-based algorithms provide a natural, intrinsic way to explore the search space. The final performance depends strongly on the way the population is manipulated. The most studied population-based methods in combinatorial optimization are Evolutionary Computation (EC) and Ant Colony Optimization (ACO). In EC algorithms, a population of individuals is modified by recombination and mutation operators, and in ACO a colony of artificial ants is used to construct solutions guided by pheromone trails and heuristic information. EC algorithms have been applied to most CO problems and to optimization problems in general. Genetic algorithms belong to the larger class of evolutionary algorithms (EAs) and are population-based metaheuristics based on the iterative application of stochastic operators.

2.2.3 Genetic Algorithms

Genetic algorithms are general-purpose search algorithms based upon the principles of evolution observed in nature. Genetic algorithms combine selection, crossover, and mutation operators with the goal of finding the best solution to a problem. Genetic algorithms search for this optimal solution until a specified termination criterion is met.


The solution to a problem is called a chromosome. A chromosome is made up of a collection of genes, which are simply the parameters to be optimized. A genetic algorithm creates an initial population (a collection of chromosomes), evaluates this population, and then evolves the population through multiple generations (using the genetic operators discussed above) in the search for a good solution to the problem at hand. Genetic algorithms can be applied to a wide variety of optimization problems such as scheduling, computer games, stock market trading, medical applications, adaptive control, transportation, the travelling salesman problem, etc.

Outline of the Basic Genetic Algorithm

1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem).
2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
3. [New population] Create a new population by repeating the following steps until the new population is complete:
   (a) [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance of being selected).
   (b) [Crossover] With a crossover probability, cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents.
   (c) [Mutation] With a mutation probability, mutate the new offspring at each locus (position in the chromosome).
   (d) [Accepting] Place the new offspring in the new population.
4. [Replace] Use the newly generated population for a further run of the algorithm.
5. [Test] If the end condition is satisfied, stop and return the best solution in the current population.
6. [Loop] Go to step 2.

Parameters of GA

1. Crossover probability specifies how often crossover is performed. If the crossover probability is 100%, then all offspring are made by crossover. Crossover is made in the hope that the new chromosomes will keep the good parts of the old chromosomes and perhaps be better.

2. Mutation probability specifies how often parts of a chromosome are mutated. If mutation is performed, part of the chromosome is changed. Mutation is made to prevent the GA from falling into a local extreme.

3. Population size specifies how many chromosomes are in the population (in one generation). If there are too few chromosomes, the GA has few possibilities to perform crossover and only a small part of the search space is explored.
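As an illustration of the outline above, the following is a minimal, hedged sketch of the basic GA loop in Python. It is not the project's implementation; the fitness function, binary encoding, and parameter values are placeholder assumptions chosen only to make the example self-contained.

```python
import random

# Placeholder fitness: maximize the number of ones in a binary chromosome.
# In the real engine the fitness would score a task-employee assignment.
def fitness(chromosome):
    return sum(chromosome)

def basic_ga(chrom_len=20, pop_size=30, generations=100,
             crossover_rate=0.8, mutation_rate=0.01):
    # [Start] random initial population of binary chromosomes
    population = [[random.randint(0, 1) for _ in range(chrom_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        new_population = []
        while len(new_population) < pop_size:
            # [Selection] fitness-proportionate (roulette wheel) selection
            parents = random.choices(population,
                                     weights=[fitness(c) + 1 for c in population],
                                     k=2)
            child1, child2 = parents[0][:], parents[1][:]
            # [Crossover] one-point crossover with a given probability
            if random.random() < crossover_rate:
                point = random.randint(1, chrom_len - 1)
                child1 = parents[0][:point] + parents[1][point:]
                child2 = parents[1][:point] + parents[0][point:]
            # [Mutation] flip each gene with a small probability
            for child in (child1, child2):
                for i in range(chrom_len):
                    if random.random() < mutation_rate:
                        child[i] = 1 - child[i]
                new_population.append(child)
        # [Replace] the new generation replaces the old one
        population = new_population[:pop_size]
    # [Test] return the best solution found in the final population
    return max(population, key=fitness)

if __name__ == "__main__":
    best = basic_ga()
    print(best, fitness(best))
```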


Encoding

The encoding of chromosomes is one of the issues to settle when starting to solve a problem with a GA, and it varies depending on the problem.

1. Binary Encoding is the most common. "In binary encoding every chromosome is a string of bits, 0 or 1; each chromosome encodes a binary (bit) string." [57] Binary encoding gives many possible chromosomes even with a small number of alleles. On the other hand, this encoding is often not natural for many problems, and sometimes corrections must be made after crossover and/or mutation.

2. Permutation Encoding: every chromosome is a string of numbers which represents a position in a sequence [57], and it can be used in ordering problems. Permutation encoding is only useful for ordering problems, and even for these problems corrections must be made after some types of crossover and mutation to keep the chromosome consistent.

3. Value Encoding: direct value encoding can be used in problems where complicated values, such as real numbers, are used; using binary encoding for this type of problem would be very difficult. In value encoding, every chromosome is a string of some values. [57] Value encoding is very good for some special problems. On the other hand, for this encoding it is often necessary to develop new crossover and mutation operators specific to the problem.

4. Tree Encoding: this encoding is mainly used for evolving program expressions in genetic programming. [57]

Initialization

Initially, many individual solutions are randomly generated to form an initial population. The population size depends on the nature of the problem, but it typically contains several hundred or thousands of possible solutions.

GA Operators

1. Selection: Selection is a genetic operator that chooses a chromosome from the current generation's population for inclusion in the next generation's population. Before making it into the next generation's population, selected chromosomes may undergo crossover and/or mutation (depending on the probability of crossover and mutation), in which case the offspring chromosome(s) are actually the ones that make it into the next generation's population.

   (a) Roulette Wheel Selection: a selection operator in which the chance of a chromosome being selected is proportional to its fitness (or rank). This technique has problems when the fitness values differ very much.
   (b) Tournament: a number Tour of individuals is chosen randomly from the population, and the best individual from this group is selected as a parent.
   (c) Top Percent: a selection operator which randomly selects a chromosome from the top N percent of the population, as specified by the user.


   (d) Best: a selection operator which selects the best chromosome (as determined by fitness).
   (e) Random: a selection operator which randomly selects a chromosome from the population.
   (f) Rank Selection: a selection operator which first ranks the population; every chromosome then receives a fitness value from this ranking.
   (g) Steady-State Selection: copies all but a few of the worst-fit individuals from the current population. The remainder are then further selected for mutation and crossover in the usual way.
   (h) Elitism: the first best chromosome, or the few best chromosomes, are copied to the new population [57]; the rest is done in the classical way. Elitism can very rapidly increase the performance of a GA, because it prevents losing the best solution found. One of the biggest problems with it is over-specialization; the algorithm often finishes too quickly, with a solution that is less than optimal.
   (i) GENITOR: also known as delete-worst.

2. Crossover: Crossover is the process of taking two parent solutions and producing a child from them. [57] The idea behind crossover is that the new chromosome may be better than both of the parents if it takes the best characteristics from each of them. Crossover occurs during evolution according to a user-definable crossover probability. The types of crossover are:

   (a) One Point: a crossover operator that randomly selects a crossover point within a chromosome, then interchanges the two parent chromosomes at this point to produce two new offspring.
   (b) Two Point: a crossover operator that randomly selects two crossover points within a chromosome, then interchanges the two parent chromosomes between these points to produce two new offspring.
   (c) Uniform: a crossover operator that decides (with some probability, known as the mixing ratio) which parent will contribute each of the gene values in the offspring chromosomes. This allows the parent chromosomes to be mixed at the gene level rather than the segment level (as with one- and two-point crossover). For some problems, this additional flexibility outweighs the disadvantage of destroying building blocks.
   (d) Arithmetic: a crossover operator that linearly combines two parent chromosome vectors to produce two new offspring according to the following equations:

       Offspring1 = a × Parent1 + (1 − a) × Parent2        (2.1)
       Offspring2 = (1 − a) × Parent1 + a × Parent2        (2.2)

   (e) Heuristic: a crossover operator that uses the fitness values of the two parent chromosomes to determine the direction of the search. The offspring are created according to the following equations:

       Offspring1 = BestParent + r × (BestParent − WorstParent)        (2.3)
       Offspring2 = BestParent        (2.4)


   (f) Cut and Splice: another crossover variant, in which the result is a change in the length of the children strings. The reason for this difference is that each parent string has a separate choice of crossover point.
   (g) Three-Parent Crossover: in this technique, the child is derived from three randomly chosen parents. Each bit of the first parent is checked against the corresponding bit of the second parent; if they are the same, the bit is taken for the offspring, otherwise the bit from the third parent is taken.
   (h) Crossover for Ordered Chromosomes: a direct swap may not be possible when the chromosome is an ordered list, such as an ordered list of the cities to be travelled in the travelling salesman problem. There are many crossover methods for ordered chromosomes; the already mentioned N-point crossover can also be applied to ordered chromosomes, but it always needs a corresponding repair process, since a crossover of chromosomes may produce recombinations that violate the ordering constraint. Examples of crossover operators that preserve a given order include PMX, OX1, OX2, POS, VR, AP, SCX, Subtour Exchange Crossover and Partially-Mapped Crossover.

3. Mutation: Mutation is a genetic operator that alters one or more gene values in a chromosome from its initial state. This can result in entirely new gene values being added to the gene pool. With these new gene values, the genetic algorithm may be able to arrive at a better solution than was previously possible. Mutation is an important part of the genetic search, as it helps prevent the population from stagnating at any local optimum. Mutation occurs during evolution according to a user-definable mutation probability. This probability should usually be set fairly low (0.01 is a good first choice); if it is set too high, the search turns into a primitive random search.

   (a) Flip Bit: a mutation operator that simply inverts the value of the chosen gene (0 goes to 1 and 1 goes to 0). This mutation operator can only be used for binary genes.
   (b) Boundary: a mutation operator that replaces the value of the chosen gene with either the upper or lower bound for that gene (chosen randomly). This mutation operator can only be used for integer and float genes.
   (c) Non-Uniform: a mutation operator that increases the probability that the amount of the mutation will be close to 0 as the generation number increases. This mutation operator keeps the population from stagnating in the early stages of the evolution and then allows the genetic algorithm to fine-tune the solution in the later stages. This mutation operator can only be used for integer and float genes.
   (d) Uniform: a mutation operator that replaces the value of the chosen gene with a uniform random value selected between the user-specified upper and lower bounds for that gene. This mutation operator can only be used for integer and float genes.
   (e) Gaussian: a mutation operator that adds a unit-Gaussian-distributed random value to the chosen gene. The new gene value is clipped if it falls outside the user-specified lower or upper bounds for that gene. This mutation operator can only be used for integer and float genes.
   (f) Insert Mutation for permutations:
       • Pick two allele values at random.
       • Move the second to follow the first, shifting the rest along to accommodate.
       • Note that this preserves most of the order and the adjacency information.


   (g) Swap Mutation for permutations (Exchange Mutation):
       • Pick two alleles at random and swap their positions.
       • Preserves most of the adjacency information (4 links broken) but disrupts order more.
   (h) Inversion Mutation for permutations:
       • Pick two alleles at random and then invert the substring between them.
       • Preserves most adjacency information (only breaks two links) but is disruptive of order information.
   (i) Scramble Mutation for permutations:
       • Pick a subset of genes at random.
       • Randomly rearrange the alleles in those positions.
   (j) Displacement Mutation:
       • Select two random points.
       • Grab the genes between them as a group.
       • Then reinsert the group at a random position displaced from the original.
   (k) Displaced Inversion Mutation:
       • Select two random points.
       • Reverse the gene order between the two points.
       • Displace them somewhere along the length of the original chromosome. This is similar to performing Inversion Mutation and then Displacement Mutation using the same start and end points.

4. Replacement: Replacement schemes are used by evolutionary algorithms to determine how the new individuals will be assimilated into the population. Replace-worst and replace-most-similar are the only really useful replacement schemes. Sometimes replace-parent can be effective, but usually only when the parents are similar to the offspring, in which case it is just replace-most-similar. Some examples of replacement policies are given below:

   • Replace if better: the offspring replaces the current individual only if its fitness value is greater.
   • Replace always: the offspring always replaces the current individual.
   • Replace worst: the offspring replaces the worst individual of the neighbourhood.
   • Replace best: the offspring replaces the best individual of the neighbourhood.
   • Replace random: the offspring replaces an individual of the neighbourhood selected randomly.
   • Replace parent: the offspring replaces one of its parents.
   • Replace most similar (crowding): the offspring replaces the individual of the neighbourhood with the closest chromosome information.

5. Termination: Termination is the criterion by which the genetic algorithm decides whether to continue searching or to stop the search. Each enabled termination criterion is checked after each generation to see if it is time to stop.

   (a) Generation Number: a termination method that stops the evolution when the user-specified maximum number of generations has been run. This termination method is always active.


   (b) Evolution Time: a termination method that stops the evolution when the elapsed evolution time exceeds the user-specified maximum evolution time. By default, the evolution is not stopped until the evolution of the current generation has completed.
   (c) Fitness Threshold: a termination method that stops the evolution when the best fitness in the current population becomes less than the user-specified fitness threshold and the objective is to minimize the fitness. It also stops the evolution when the best fitness in the current population becomes greater than the user-specified fitness threshold when the objective is to maximize the fitness.
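To make some of the operators above concrete, here is a small, hedged sketch of tournament selection, two-point crossover, and swap mutation on list-encoded chromosomes. It is illustrative only; the helper names and parameter values are ours, and the operator variants actually adopted by the engine are selected experimentally in later chapters. Note that, as the text points out, applying N-point crossover to permutation-encoded chromosomes would additionally require a repair step.

```python
import random

def tournament_selection(population, fitness, tour_size=3):
    # Pick `tour_size` individuals at random and return the fittest one.
    contenders = random.sample(population, tour_size)
    return max(contenders, key=fitness)

def two_point_crossover(parent1, parent2):
    # Exchange the segment between two random cut points.
    n = len(parent1)
    p1, p2 = sorted(random.sample(range(1, n), 2))
    child1 = parent1[:p1] + parent2[p1:p2] + parent1[p2:]
    child2 = parent2[:p1] + parent1[p1:p2] + parent2[p2:]
    return child1, child2

def swap_mutation(chromosome, mutation_rate=0.05):
    # Exchange two randomly chosen positions with a small probability.
    chromosome = chromosome[:]
    if random.random() < mutation_rate:
        i, j = random.sample(range(len(chromosome)), 2)
        chromosome[i], chromosome[j] = chromosome[j], chromosome[i]
    return chromosome

if __name__ == "__main__":
    # Toy integer-valued chromosomes and a placeholder fitness function.
    population = [[random.randint(0, 9) for _ in range(10)] for _ in range(20)]
    fitness = lambda c: sum(c)
    parent_a = tournament_selection(population, fitness)
    parent_b = tournament_selection(population, fitness)
    child_a, child_b = two_point_crossover(parent_a, parent_b)
    print(swap_mutation(child_a), swap_mutation(child_b))
```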

2.3 Related Work

The teacher-course assignment problem is a widely studied area, and many researchers have developed potentially useful algorithms to solve it. "Solving the Teacher Assignment Problem by Two Metaheuristics" (Aldy Gunawan and Kien Ming Ng, 2011) [7] proposed simulated annealing and tabu search techniques to produce an optimal solution to the problem. The problem is formulated in that paper so as to allow each course section to be taught by more than one teacher, and the teachers may be full-time or part-time. The definition of the teacher's load in that paper is also different: it refers to the number of courses taught rather than the number of assigned hours. The penalty values represent the excess in the total number of courses taught by full-time teachers who have to teach more than a predefined number R of courses. Simulated Annealing¹ is a generic probabilistic metaheuristic for the global optimization problem of locating a good approximation to the global optimum of a given function in a large search space. The SA mechanism, as its name implies, exploits an analogy between the way in which a metal cools and freezes into a minimum-energy crystalline structure (the annealing process) and the search for a minimum in a more general system; it is a means of finding the equilibrium configuration of a collection of atoms at a given temperature. The major characteristic of simulated annealing is that it avoids getting trapped at a local minimum by accepting non-improving neighbourhood moves. The paper uses a geometric cooling schedule² in SA and implements the SA sequence solution shown in Figure 2.1.

¹ Described by Scott Kirkpatrick, C. Daniel Gelatt and Mario P. Vecchi in 1983.
² Cooling schedules may alter the speed at which the temperature is reduced depending on the current iteration.


Figure 2.1: SA sequence solution
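As a rough illustration of the simulated-annealing scheme described above (not the cited authors' actual procedure), the following sketch shows a generic SA loop with a geometric cooling schedule; the neighbourhood move, energy function, and parameter values are placeholder assumptions.

```python
import math
import random

def simulated_annealing(initial_solution, energy, neighbour,
                        t_start=100.0, t_end=0.1, alpha=0.95, moves_per_temp=50):
    """Generic SA minimizer with geometric cooling: T <- alpha * T."""
    current = initial_solution
    best = current
    temperature = t_start
    while temperature > t_end:
        for _ in range(moves_per_temp):
            candidate = neighbour(current)
            delta = energy(candidate) - energy(current)
            # Accept improving moves always; accept worsening moves
            # with probability exp(-delta / T) to escape local minima.
            if delta <= 0 or random.random() < math.exp(-delta / temperature):
                current = candidate
                if energy(current) < energy(best):
                    best = current
        temperature *= alpha  # geometric cooling schedule
    return best

if __name__ == "__main__":
    # Toy example: minimize (x - 3)^2 over integers using +/-1 moves.
    result = simulated_annealing(
        initial_solution=50,
        energy=lambda x: (x - 3) ** 2,
        neighbour=lambda x: x + random.choice((-1, 1)),
    )
    print(result)
```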

The statistical results³ show that the SA algorithm yields better solutions when compared to a manual allocation procedure. The provided solution has advantages and disadvantages: among its advantages are the relative ease of implementation and the ability to provide reasonably good solutions; although it is a robust technique, its drawbacks include the need for a great deal of computer time for many runs. Tabu search⁴ is a single-state optimization method that uses a form of short-term memory to keep the search from becoming trapped in a local optimum. A tabu list is formed that keeps track of recent solutions. At each iteration of the optimization process, solutions are checked against the tabu list; a solution that is on the list will not be chosen for the next iteration (unless it overrules its tabu condition by what is called an aspiration condition). The tabu list forms the core of tabu search and keeps the process from cycling in one neighbourhood of the solution space. TS searches a subset or the whole of the neighbourhood space for a good new solution under certain restrictions. When TS finds a better solution, it treats the new one as the current solution; when there is no better solution, the best solution in the neighbourhood is picked. The result of the manual allocation is worse than that of the SA and TS algorithms. This is because in practice the administrator does not limit the number of courses taught by a full-time teacher, and thus an unbalanced load among full-time teachers is produced.

³ "Solving the Teacher Assignment Problem by Two Metaheuristics", Aldy Gunawan and Kien Ming Ng, p. 84, Table 6.
⁴ Created by Fred W. Glover in 1986.


"An Application of Genetic Algorithm Methods for Teacher Assignment Problems" (Yen-Zen Wang, 2002) [48] proposed a Genetic Algorithm⁵ to solve the problem, using the standard version of GA to implement the teacher assignment problem. The problem is formulated in that paper to maximize teachers' preferences, to handle teachers' upper and lower load constraints, and to reduce the variance in teachers' assigned hours. Genetic algorithms proved to be efficient in finding optimal or suboptimal solutions in a reasonable time. The overall effect of a GA's work is to move the population P towards areas of the solution space with higher values of the fitness function. The GA search is directed by the fitness function. This direction is not based on whole chromosomes, but on those parts of them that are strongly related to high values of the fitness function; these parts are called building blocks. It has been demonstrated that GAs are very efficient at processing building blocks, so GAs are useful for every problem where an optimal solution is composed of a collection of building blocks. This computational paradigm allows an effective search in very large search spaces and has recently been applied to various kinds of optimization problems. In that paper a binary representation is used for the coding of each solution. The operating process of a genetic algorithm is to search the solution space, which is constructed by the combination of parameters. The mutation operation randomly selects a chromosome with a prespecified probability; the selected chromosome is then decoded into its binary equivalent. In that paper, due to the differences in the genetic representation elements of the solving set, the operational process of the genetic operators has to be partially modified. The operation process of the mutation operator is as follows:

1. Use a random process to select a chromosome to mutate within the population, say S_X.
2. Use a random process to select a course C_y in S_X.
3. Use a random process to select a teacher in Φ_Cy, and exchange it into matrix C_y.

Crossover is an operation to generate a new string (i.e. a child) from two parent strings. It provides a mechanism for chromosomes to mix and match through random processes. The operation process of the crossover operator is as follows:

1. Use a random process to select a pair of chromosomes within the population, say S_A and S_B.
2. In chromosomes S_A and S_B, use a random process to select a crossover point cp, where 1 ≤ cp ≤ Σ n(C_i).
3. Based on the point cp, exchange the parameter combinations of S_A and S_B.
4. Repeat (1), (2) and (3) until all chromosomes in the population have been through the crossover process.

The process of using genetic algorithms to solve teacher assignment problems is as follows:

1. Use a random process to create the initial population.
2. Calculate the fitness function (F) of each chromosome and evaluate the penalty function for each upper/lower constraint violation.

⁵ The notion of a genetic algorithm was first introduced by Bagley in [BAGL67], who used them to solve problems of game theory.


3. According to each chromosome's fitness value derived from the population, the genetic operators (reproduction, crossover, and mutation) are applied to create new combinations of teacher assignments.
4. Place the new offspring in the population based on the replacement strategy.
5. Repeat (2), (3), and (4) until the fitness function is satisfied or the default number of runs is reached.

Compared with the traditional approach, which takes five days to schedule the course timetable at Far East College, the GA method can save much time and satisfy teachers' expectations for the timetable. While SA creates a new solution by modifying only one solution with a local move, GA also creates solutions by combining two different solutions; whether this actually makes the algorithm better or worse depends on the problem. Although GA has many advantages over SA, in that all solutions are pushed in a favourable direction, it is not guaranteed that the global minimum will be obtained. GA keeps a sample of candidate solutions rather than the single candidate solution kept by tabu search and all single-state search strategies, and each of the solutions is involved in the search: good solutions may cause poor solutions to be rejected and new ones created, or may cause them to be tweaked in the direction of the better solutions. GA also has another advantage over tabu search in that it pushes all solutions in a favourable direction, whereas in tabu search there is no such mathematical guarantee.

In typical applications, genetic algorithms (GAs) process populations of potential problem solutions to evolve a single population member that specifies an "optimized" solution. In a cooperative parallel setting, individuals are run as a set of processes that cooperate periodically and exchange information to solve the problem efficiently. Different methods can be applied to a cooperative GA. In "Parallel Processing of Cooperative Genetic Algorithm for Nurse Scheduling" (Makoto Ohki, Shin-ya Uneme, Hikaru Kawano, 2008) [30], one of these methods is applied to a scheduling problem for nurses. The proposed method initially generates three processes, although there is no limitation on the number of processes. The first process, Proc0, is generated on the computer on which an operator starts the nurse scheduling. The processes communicate every GC generations, where the value of GC should be defined as a multiple of GM, and Proc0 manages the communication. In the communication, all the processes send the best schedule acquired by the optimization over the last GC generations to Proc0; Proc0 selects the best schedule among the schedules sent from all the processes and sends it back to all of them, and each process then restarts its optimization from the best schedule given by the communication. Therefore, the search always continues near the best solution found so far.
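The periodic best-solution exchange described above can be sketched roughly as follows. This is a hedged illustration of the general elite-exchange pattern using Python threads, not the cited authors' implementation; the class names, the GC interval, and the trivial local search step are placeholder assumptions.

```python
import random
import threading

class EliteServer:
    """Plays the role of Proc0: collects candidates and shares the best one."""
    def __init__(self):
        self.lock = threading.Lock()
        self.best = None
        self.best_fitness = float("-inf")

    def exchange(self, candidate, fitness_value):
        # Submit a local best and receive the global best so far.
        with self.lock:
            if fitness_value > self.best_fitness:
                self.best, self.best_fitness = candidate, fitness_value
            return self.best

def subpopulation_worker(server, fitness, generations=200, gc_interval=20):
    # Placeholder "schedule": a list of integers; higher sum is better.
    current = [random.randint(0, 9) for _ in range(10)]
    for gen in range(1, generations + 1):
        # Placeholder local optimization step: mutate and keep the better solution.
        candidate = current[:]
        i = random.randrange(len(candidate))
        candidate[i] = random.randint(0, 9)
        if fitness(candidate) >= fitness(current):
            current = candidate
        # Every GC generations, exchange the local best with the elite server
        # and restart the local search from the global best.
        if gen % gc_interval == 0:
            current = server.exchange(current, fitness(current))[:]

if __name__ == "__main__":
    server = EliteServer()
    fitness = lambda schedule: sum(schedule)
    workers = [threading.Thread(target=subpopulation_worker, args=(server, fitness))
               for _ in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(server.best, server.best_fitness)
```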

2.4 Conclusion

This chapter has offered a detailed description of the different kinds of metaheuristics and their characteristics. It discussed three teacher-assignment solution approaches and presented a comparison between three algorithms for solving combinatorial problems. It then tried to arrive at a trade-off between a detailed description of the metaheuristics' working principles and a fast survey of techniques.

CHAPTER 2. LITERATURE REVIEW

17

18

Chapter 3

Problem Formulation 3.1 3.1.1

Problem Description Overview

It is widely known that a well-formulated problem is a problem that is nearly solved. By formulating the problem , the solution becomes obvious or is more easily obtained, well-defined problem is a key to a successful problem solving. This chapter is to formulate teacher assignment problem in mathematical model. The scheduling of courses has become a tested of numerous optimization methods, plausibly as it affects the livelihood of a similarly vast number of operations researchers. We focus on the assignment of Courses to teachers. The problem faced in teacher assignment is how to assign and schedule the teachers to the Courses and Course sections in high efficient way, while ensuring that all conflicted goals are considered. The requirements imposed are as follows: 1. All courses’ sections will be staffed. 2. Each course section can only be taught by one teacher. 3. Teachers is overloaded if needed. 4. Each teacher will not be assigned courses that he/she is unable to teach in term of teaching requirements (specialization/degree). 5. Each teacher’s preferences need to be evaluated. 6. teacher’s experiences should be considered. 7. Teacher’s assigned load should be balanced with other teachers. 8. One teacher should not be appointed to teach too many different courses, in order to avoid extraneous burden. 9. Achieved teacher’s preference should be balanced with other teachers.

3.1.2

Assumptions

1. We assume the teacher enter the preferences in order. 2. We assume that there is some courses needs specific level of teaching skills (degree)

CHAPTER 3. PROBLEM FORMULATION

3.1.3

Given Data

1. Course namec = course name. Spc = required teacher’s specialization. LDc = lower required teacher’s degree. Lc = course load (hours). 2. Sectionc IDcs = section identification. Ccs = course name. Tcs = Assigned Teacher identifier. 3. Teacher Spt = Specialization list of teacher t. ULt = upper load of teacher t. Dt = degree of teacher t. Wt = Set of the preferred courses to teacher t. ∀t ∈ T : Wt = C1, C2, C3, ..., Cnt Et = Set of teacher t experiences. ∀t ∈ T : Et = (C1, #year), (C2, #year), (C3, #year), ..., (Crt, #year) where r is number of experienced course.

3.2

Mathematical formulation

Teacher assignment problem formulated as a CSP where each variable Section S = {Sik |1 ≤ i ≤ n : 1 ≤ k ≤ |Ci |} corresponding to teacher from the domain Dik = T within the constraints that expressed algebraically as : ∀ j ∈ T, ∀i ∈ S : Xi j → j B i where all courses’ sections have to appointed. ∀ j ∈ T, ∀i ∈ S : Xi j → (Sp j = Spi D j ≥ Di ) to fit the course teaching requirements. ∀ j, k ∈ T, ∀i ∈ S : Xi j ∧ Xik → (j = k) where only one teacher has to be appointed to the section.

19

CHAPTER 3. PROBLEM FORMULATION

20

Each solution should to be validate the above hard constraints to be a valid solution, handling teacher upper load constraints expressed as: ∀j ∈ T : AH j ≤ UL j We treat the upper load constraint as a soft constraint, if the solution exceeds the upper load, penalty function will calculated as : AHi − ULi ; ifAHi >ULi 0 otherwise

( Pi =

(3.1)

The objective function is broken up into many objective functions aim to maximize teacher preference, the assignment fairness, and teaching quality. The order of preferred course of the teacher is calculated as : (

ordero f Ci inW j ; ifCi ∈ W j 0 ; ifCi < W j

α(T j , Si k) = ( β(T j , Sik ) =

(n − α(T j , Sik ))/n ; ifCi ∈ W j 0 ; ifCi < W j

The total achieved preference is f1 : f1 =

|Ci | n X X

β(Sik [T], Sik )

(3.2)

i=0 k=0

Teacher’s experiences in specific subject calculated as:  E j [Ci ]    Pm Ee[Ci ] ; ifCi ∈ E j γ(T j , Sik ) =  e=0   0 ; ifCi < E j f2 =

|Ci | n X X

γ(Sik [T], Sik )

(3.3)

i=0 k=0

Fairness in number of teachers assigned courses is calculated as : setAC = {AC j : ∀ j ∈ T} f 3 = var(setAC)

(3.4)

Fairness of number of teachers assigned hours represented as : AH j

setAH = { UL j : ∀j ∈ T} f 4 = var(setAH) Fairness on teacher achieved preferences: AcheivedPR j = {totalAchievedpre f erences f orteacher j : ∀ j ∈ m} f =

n X

|ci | X

i=0 k=0PR j +β(Sik [T],Sik )

(3.5)

CHAPTER 3. PROBLEM FORMULATION

21

f 5 = var(setAcheivedPre f erences)

3.2.1

(3.6)

Normalization

Objective functions resultants will be numeric values with a different ranges, the following process will unite the resultants format to be in the range [0,1]. #Section =

Pn

i=0 |Ci |

f1 =

f1 #Sections

f2 =

f2 #Sections

setmaxCourseVariance = {n, 0} f3 =

f3 var(setmaxCourseVariance)

P setmaxHourVariance = { #Sections Sk .C[H], 0} k f4 =

f4 var(setmaxHourVariance)

setMaxVariancePre f erences = {#Sections, 0} f5 =

f5 var(setMaxVariancePre f erences)

The object function is maximize f 1 , maximize f 2 ,minimize f 3, minimize f 4, minimize f 5 these are combined at one objective function F Maximize : F = w0 ∗ f 1 + w2 ∗ f 2 − w3 ∗ f 3 − w4 ∗ f 4 − w5 ∗ f 5 And to handle the soft constraint 3.1: F = F − Pi

(3.7)

CHAPTER 3. PROBLEM FORMULATION

3.3

Problem Structure

3.4

Summary

22

In this chapter, we have proposed a mathematical formulation of teacher assignment problem. This formulation is to identify the constraints on either the independent or dependent variables. We provided objective function which indicates how much each variable contributes to the value to be optimized in the problem. Our proposed objective function is to maximize teacher preference, Teacher’s experiences and to minimize the fairness in number of teachers assigned courses , fairness of number of teachers assigned hours and fairness on teacher achieved preferences.

CHAPTER 3. PROBLEM FORMULATION

23

24

Chapter 4

Applying GA on Assignment Problem 4.1

Algorithms Design

There are several usability-related issues, methods, and procedures that require careful consideration when designing and developing Intelligent Assignment software. The most important of these are presented in this chapter, including genetic algorithm different operators: initialization, selection, crossover, mutation and replacement also many others important parts must be considered in designing process 1 .

1

Check appendix c.1 for more details about different GA operators pseudo code

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

25

1. Create individual Individuals are created by assign a random teacher from qualified teacher list to each section to grantee that all courses achieve its requirements. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

26

2. Initialization In initialization step, the create individual will be called to initialize population chromosomes. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

27

3. Fitness To compute objective function for many factors including teacher preferences considerateness, experience and fairness between teachers according to assigned courses, assigned hours and satisfaction for each teachers preferences. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

3.1 Preferences This method computes teacher satisfied preferences and normalize the result. (a) Flowchart:

28

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

3.2 Experience This method computes teacher‘s experience and normalizes it. (a) Flowchart:

29

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

30

4. Selection Selecting an individual from a population of individuals to apply genetic operators, the chosen selection method is tournament selection that involves running several ”tournaments” among a few individuals chosen at random from the population. The winner of each tournament (the one with the best fitness) is selected for crossover. If the tournament size is larger, weak individuals have a smaller chance to be selected. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

31

5. crossover Crossover method is also one of the genetic operators that takes two parents (chromosomes) and generate 2 new offspring try to discover most of the search space. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

32

6. Mutation This method takes a chromosome as an input, mutates a point and replaces its assigned teacher by other qualified teacher for this course. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

33

7. Replacement In replacement, a weaker parents is replaced by stronger children, sort parents population and new children chromosomes according to their fitness is become as a first step, the sorting process is done with a heap sort that has an advantage of a more favourable worstcase O(n log n) runtime, after sorting step, comparison between parent and children to replace the worst (n) parents only if the new children is better. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

34

8. Validation Validation will apply a penalty function if a chromosome violates the upper load hard constraint by assign hours to teacher more than his upper load, the penalty function will compute the distance between feasible solution and infeasible one then subtract it from the chromosome fitness. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

35

9. Select Solution To obtain the best assignment among all discovered search space, select chromosome that has a highest fitness among all chromosomes. (a) Flowchart:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

10. Genetic algorithm (a) Flowchart:

36

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

4.2

37

Experimental

Genetic algorithms can be a very successful means of problems. This success, however, depends on many factors: the type of crossover operator, the selection mechanism, the type of mutation, population size and the optimal parameter setting. Choosing GA optimal parameters and GA operator degree setting depend on experiment until they achieve adequate performance for the given problem. The result of several experiments done to find the most successful combinations of parameter settings and other is shown in this section. In order to evaluate the performance of the proposed solutions numerically,this section has used a real data sets 2 . All the proposed cases for experiment the problems are coded in Java and tested on Inter(R) Core(TM) i7 2.20 GHz with 8.00 GB RAM under the Microsoft Windows.7 Operating System. For each data set, each case was executed twenty times. each time the computational result recorded in a connected data base automatically, the recorded information is the achieved fitness besides the execution time, a summary of each case (twenty test results) will recorded in other table of data base, it contains the maximum fitness achieved for all twenty test, the minimum fitness, the average fitness among the twenty test, the mode (most repeated) fitness, the standard deviation and the average of the consumed time in seconds.

2

check appendix A ’Experiment Data Set’

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

4.2.1

38

Comparison of IAE Algorithm and literature Algorithm

Teacher assignment problem arise in many area in the real life, many researches discuss this problem under different constraints and many defined factors, the following experiment will discuss one of that solutions and IAE solution, and it will compare the two completed algorithms each with its own characteristics. ”An application of genetic algorithm methods for teacher assignment problem” paper (YenZen Wang, 2002)(YZW)uses the standard version of GA to implement teacher assignment problem; it aims to maximize teacher’s preferences, to handle teacher’s upper and lower load constraints and to reduce the variance in teachers assigned hours. Intelligent assignment engine(IAE) uses a special version of GA with the best detected parameter 3 and evaluated operators; IAE aims to maximize teacher’s preferences and experiences and to minimize the variance in assigned hours, courses and achieved preferences and to handle teacher’s upper load and course teaching requirements constraints, IAE algorithm uses tournament selection method which obtained better performance on many factors that effect on the solution compared by roulette wheel method 4 , in reproduction process IAE uses one point crossover which is achieved better solutions than two points crossover 5 , and uses section-based mutation after compare it with course-based mutation 6 , in replacement stage IAE uses unsorted replacement mechanism which consumed better elapsed time than ordered replacement mechanism .

3

check appendix C.3 check appendix C.2 5 check appendix C.2 6 check appendix C.2 4

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

39

Comparing IAE Algorithm with YZW algorithm by applying the same dataset, evaluating the solutions by the same evaluation model and using different implemented algorithms. Comparison result: Algorithms with factors IAE Algorithm YZW Algorithm

AVG 0.9274 0.746168

MAX 0.8109 0.746168

MIN 0.87786 0.746168

STD 0.018861 -

MODE 0.9274 0.746168

Table 4.1: comparing IAE algorithm and Yen-Zen Wang algorithm

Figure 4.1: IAE algorithm and YZW algorithm Representation from the statistical results, minimum fitness values provided by IAE algorithm is higher than the maximum fitness value provided by Yen-Zen Wang algorithm (See Fig.4.1), and the frequently fitness value of IAE algorithm is 0.91 versus 0.74 in the best case for Yen-Zen Wang algorithm.

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

4.2.2

40

Comparison of IAE problem model and Literature problem model

The Teacher course assignment is an optimization problem; many factors must be obtained under many different constraints, some researches interested on this problem with a special considered factors and special formulated models, ”An application of genetic algorithm methods for teacher assignment problem” paper (Yen-Zen Wang, 2002) ”named YZW model” discussed this problem, it aims to maximize teacher preferences and fairness of teaching hours while handling the lower and upper teacher load constraint. ”YZW model” formulated a teacher course assignment problem as the following: ( Ord(Ci , T j ) =

ordero f Ci inW j ; ifCi ∈ Wt j 0 ; ifCi < Wt j

To calculate the order of the preferred course C j to the teacher T j . ( Λ − ωOrd(Ci , T j ) ; ifCi ∈ Wt j Γ(Ci , T j ) = 0 ; ifCi < Wt j Λ is a positive integer and w is a positive weight. The objective function is to maximize the teachers preferences’ and to handle the upper and lower load constraints.

maxJ =

n(T) n(c) X m X X

Γ(Ci j, Tk )δi jk

i=1 j=1 k=1

Pn(T) k

δi jk = 1∀i ∈ {1, 2, , m}, j ∈ {1, 2, , n(c)}

where δi jk = 1 indicated that each section of different classes can be appointed to only one teacher ”YZW model” maintains the upper and lower load constraints as: Btk ≤

Pm Pn(c) i=1

j=1

r j δi jk ≤ Utk ∀Tk ∈ T

”YZW model” treats the upper and lower and assigned hours variance constraints as soft constraints, if the solution exceeds the upper/lower load or has high variance in teachers assigned hours, penalty function will calculated as:   α(Qi − Bti ) ; ifQi Uti Pi1 =     0 otherwise   γ(Qi − (Bti + Oavg )) ; ifQi (Bti + Oavg ) Pi2 =     0 otherwise The penalty function of each subject scheduled is:

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

J1 =

n(t) X

41

(Pi1 + Pi2 )

i=1

”YZW” objective function is: MaximizeF = J0 − J1 IAE focused on teacher course assignment problem by maximize teacher preferences, teacher experiences, fairness of assigned hours, fairness of number of assigned courses, and fairness of teachers achieved preferences, while handle upper load and unique management constraints. IAE model formulated teacher assignment problem as the following: Order of preferred course of the teacher calculated as: (

ordero f Ci inW j ; ifCi ∈ W j 0 ; ifCi < W j

(

n − α(T j , Sik )/n ; ifCi ∈ W j 0 ; ifCi < W j

α(T j , Si k) = β(T j , Sik ) = The total achieved preference is f1 : f1 =

|Ci | n X X

β(Sik [T], Sik )

(4.1)

i=0 k=0

Teacher’s experiences in specific subject calculated as:  E j [Ci ]    Pm Ee[Ci ] ; ifCi ∈ E j γ(T j , Sik ) =  e=0   0 ; ifCi < E j f2 =

|Ci | n X X

γ(Sik [T], Sik )

(4.2)

i=0 k=0

Fairness of number of teachers assigned courses : setAC = {AC j : ∀ j ∈ T} f 3 = var(setAC)

(4.3)

Fairness of number of teachers assigned hours : AH j

setAH = { UL j : ∀j ∈ T} f 4 = var(setAH) Fairness of teacher achieved preferences: AcheivedPR j = {totalAchievedpre f erences f orteacher j : ∀ j ∈ m} setAcheivedPre f erences = {Sik [T].PR+ = β(Sik [T], Sik ) : ∀i ∈ n∀k ∈ |Ci |}

(4.4)

CHAPTER 4. APPLYING GA ON ASSIGNMENT PROBLEM

42

f 5 = var(setAcheivedPre f erences)

(4.5)

The object function is maximize f 1, maximize f 2, minimize f 3, minimize f 4, minimize f 5 these are combined at one objective function F Maximize :

F = w0 ∗ f 1 + w2 ∗ f 2 − w3 ∗ f 3 − w4 ∗ f 4 − w5 ∗ f 5

(4.6)

IAE treat the upper load as a soft constraint, if the solution exceeds the upper load, penalty function will calculated as: ( Pi =

AHi − ULi ; ifAHi fitness value of best chromosome then set chromosome i is best chromosome i=i+1 return p

APPENDIX C.

138

3. Fitness (a) Algorithm: Algorithm 3: Fitness(ref Chromosome c) this algorithm return the fitness value of the chromes pre: each Chromosome contain a list of course’s section, list of teachers and fitness function value. each section contain assigned teacher info, course info post: return fitness value return :nothing preference = NORMALIZE ( preference (c) ) Experience = NORMALIZE ( Experience (c) ) courses variance = NORMALIZE ( variance(c) ) hours variance = NORMALIZE ( variance(c)) Preference variance= NORMALIZE ( variance(c) ) fitness = preference + experience -courses variance - hours variance - Preference variance 1

1

See ”sample variance”, ”introduction to statistics and data analysis, probability and statistics for engineering and scientists, 8th edition, pg.15

APPENDIX C.

139

4. Preferences (a) algorithm : Algorithm 4: Preference this algorithm will calculate the preferred courses order assigned to the teacher pre: completed copy from a Chromosome post: order of preferred courses Assigned return :preference preference=0 Loop counter= 0 to size of section Assignedcourse = name of course assigned to this section[counter] Assignedteacher= name of teacher assigned to this section[counter] List of Preferred course = Assigned teacher preferred courses order=0 Loop j=0 to size of Preferred course list if Assignedcourse is equal to Preferred course j then order= (num-course-j)/(num-course) AssignedteacherTotal prefrence for assigned courses= AssignedteacheTotal prefrence for assigned courses + order preference= preference +order preference=preference/num Section return preference

APPENDIX C.

140

5. Experience (a) algorithm : Algorithm 5: Experience (Chromosome c) this algorithm will calculate the experience of the teacher on the assigned course pre: completed copy from a chromo me post: retun teacher experience return :TExperince. TExperince=0 Loop counter=0 to size of section Assignedcourse = name of course assigned to this section[counter] Assignedteacher= name of teacher assigned to this section[counter] experince=0 Loop until size of Assignedteacherexperenice if Assignedcoures==Assignedteacher course taught befor by him/his then experince=number of time Assignedteacher taught this course / experience of all teacher TExperince= TExperince +experince TExperince=TExperince/num Section

APPENDIX C.

141

6. Selection (a) algorithm : Algorithm 6: Selection(ref list of Population) this algorithm return the best Chromosome with highest fitness function among all Chromosomes which have been chosen at random (tournament selection) pre: each population contain a list of chromosomes, and each chromosome contain a list of course’s section, list of teachers and fitness function value. each section contain assigned teacher info, course info post: return Chromosome return :the Best Chromosome list of chromosome i=0 Loop form 0 to size of chromosome list chromosome i =select chromosome randomly form population i=i+1 best chromosome =chromosome 0 i=1 Loop form 1 to size of chromosome list if fitness of chromosome i >fitness of best chromosome) then best chromosome =chromosome i i=i+1 return best chromosome

APPENDIX C.

142

7. crossover (a) algorithm : Algorithm 7: crossover(val chrom1 , val chrom2 ) this algorithm will generate new offspring of a given parent by using crossover operator pre: two completed assignments post: two new offspring return :two new offspring select random number from section swap two chromosomes at the selected random section Fitness Function (chromosome x1) Fitness Function (chromosome x2) return x1,x2

APPENDIX C.

143

8. Mutation (a) algorithm : Algorithm 8: Mutation (val chrom ) this algorithm will generate a new offspring by mutates the given chromosome pre: completed assignment post: new chromosome return :new chromosome chromosome x1 = chrom Select random section from x1 set null teacher for the selected section choose random teacher from qualified teacher list for this course set the selected teacher for the selected section Fitness (x1) return x1

APPENDIX C.

144

9. Replacement (a) algorithm : Algorithm 9: Replacement(VAL Chromosome N parent, val chromosome n offspring) this algorithm will arrange the two parents and 2 offspring to select and copy two highest individuals which are better. pre: list of 2 parents and of new offspring Sort parents and offspring in array copy two best of them to parents

APPENDIX C.

145

10. Validation (a) algorithm : Algorithm 10: validation ( ref chrom ) this algorithm check on assignment validation’s; in each assignment teacher’s degree and specialization must fit the course degree and specialization, no two teacher appointed to teach the same section, and the assigned hour to the teacher must be within the teacher allowable load. pre: each chromosome contain a list of course’s section, list of teachers and fitness function value. each section contain assigned teacher info, course info post: checking result return :false if the assignment violated constraints ,true if it is valid valid = true Loop until last teacher if the teacher assigned hour greeter than his upper load then Penelty= (teacher assignedhour - teacher upperLoad) / num sections subtract the Penelty value from chrom fitness value valid = false return (valid)

APPENDIX C.

146

11. Select Solution (a) algorithm : Algorithm 11: SelectSolution(Population p) this algorithm return the best solution with highest fitness function among all solution in all population pre: the population contain a list of chromosomes, and each chromosome contain a list of course’s section, list of teachers and fitness function value. Each section contains assigned teacher info, course info. post: return solution return :the Best solution Solution=best chromosome fitness in p return (Solution)

APPENDIX C.

147

12. Genetic algorithm (a) algorithm : Algorithm 12: GA () This algorithm combines all GA operators in required order to obtain solution. pre: nothing post: the best assignment has been computed return :assignment solution Initialization Loop GeneLen copy elites Loop population size/2 selection if it in crossover rate then crossover else only copy parents to offspring if it in mutation rate then mutate first offspring if it in mutation rate then mutate second offspring replacement replace worst chromosomes with the copied elites select solution

APPENDIX C.

C.2

148

Genetic Algorithm Operator Test

There are many variations of the standard GA operators, most of which can be applied in varying degrees. Each operator contributes in some way to either the amount that a GA will explore the search space or the degree to which it will exploit individuals. GA operator testing have been done under 100 chromosomes, 500 population number, pool with size 4, 90% crossover rate, 40% mutation rate, and default GA operator: tournament selection approach, single point crossover, section-based mutation and unsorted replacement. Selection is an essential stage of a genetic algorithm in which chromosomes are chosen from a population for later reproduction. Selection mechanism has many strategies each strategy tries to reduce the search effort and to raise the chance to reach to the optimal solution in a certain way. The following experiment (Table.C.1) evaluates two type of selection strategy, first approach is tournament selection, which involves running several ”tournaments” among a few chromosomes chosen at random from the population, and the winner of each tournament (the one with the best fitness) is selected. Second approach is roulette-wheel selection, also called stochastic sampling with replacement which maps the chromosomes to contiguous segments of a line, such that each chromosome’s segment is equal in size to its fitness. A random number is generated and the individual whose segment spans the random number is selected. Selection Strategy Roulette Wheel Method Tournament Method

AVG 1.14557 1.14806

MAX 1.1567 1.1567

MIN 1.1252 1.1328

STD 0.009187 0.006599

MODE 1.1567 1.1567

Table C.1: Selection Strategies Results

Figure C.1: Selection Strategies Results Error Bar From table.C.1 we can see that there is many different between the two approaches, Tournament Strategy has a higher average fitness, smaller standard deviation and reduced the chance to select the MIN fitness.

APPENDIX C.

149

Crossover with 2-point takes parent individuals, randomly selects two cut off points along the chromosomes, and then swaps all of the genes that fall in between those two points. A single crossover point on both parents’ organism strings is selected, all data beyond that point in either organism string is swapped between the two parent organisms. The resulting organisms are the children. In the following experiment (Table.C.2) two versions of crossover operator one and two points have been used under the same circumstance: Crossover method One Point Two Point

AVG 1.14547 1.14358

MAX 1.15672 1.15672

MIN 1.13147 1.12522

STD 0.008401 0.010198

MODE 1.14043 1.15612

Table C.2: Crossover split points Results

Figure C.2: Crossover split points Error Bar From the statistical results, the one point recorded higher result than two point version in terms of average, minimum fitness and standard deviation. The single crossover point maintains diversity during search process more than 2-point crossover, so it has a higher quality solution.

APPENDIX C.

150

Mutation used to maintain genetic diversity from one generation to the next. In mutation, the solution may change entirely from the previous solution. Hence GA can come to better solution by using mutation. In the following experiment (Table.C.3) mutation operator have been mutate the chromes in two different ways course-based mutation and section-based mutation, both of them have been used under the same circumstance: Mutation method One Course One Section

AVG 0.629861 1.14636

MAX 0.723298 1.15612

MIN 0.555368 1.13301

STD 0.049554 0.007144

MODE 0.723298 1.14043

Table C.3: Mutation Mechanisms Results

Figure C.3: Mutation Mechanisms Error Bar

By analysing the statistical results, the section-based mutation operator gives a better fitness than the other one because the probability has been set low and the other has been set to high, so the search will turn into a primitive random search.

APPENDIX C.

151

Replacement operator have used in two ways, one with sorting algorithm and without sorting, the following experiment (Table.C.4) showing the comparison result. Replacement method Sort without Sort

AVG 1.14917 1.14732

MAX 1.15612 1.15612

MIN 1.13301 1.12582

STD 0.006106 0.00759

MODE 1.14796 1.14784

time 29 4

Table C.4: Replacement Mechanisms Results

Figure C.4: Replacement Mechanisms consumed time

Figure C.5: Replacement Mechanisms Error Bar There is no a big difference between achieved fitness in both mechanisms but on the other hand the sorted replacement consumed too larger time than unsorted one.

APPENDIX C.

C.3

152

Genetic Algorithm Parameter Setting

Optimal parameter selection is a crucial step in improving the quality of genetic algorithm. This section reveals the results of several experiments to find the most successful combinations of parameter settings. GA parameters setting testing have been done under the GA operator: tournament selection approach, single point crossover, section-based mutation and unsorted replacement. Generation is a set of solution (chromosomes) each generation produced from other generation, having many generation can contributes to explore the search space but can decrease the overall efficiency, the optimal generation numbers produces the highest fitness in the least time. The following experiment Table.C.5 displays the achieved fitness value and the consumed time for each generation number. The default population size is 100, pool size is 3, crossover rate is 0.9, and mutation rate is 0.4. Generation Number 50 100 150 200 250 300 350 400 450 500 550 600 650 700 750 800 850 900

AVG 1.03785 1.11249 1.13129 1.13652 1.13732 1.13732 1.1377 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808 1.13808

MAX 1.0858 1.1397 1.1484 1.1485 1.1485 1.1485 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561

MIN 0.9731 1.0787 1.1084 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252 1.1252

STD 0.03139 0.014579 0.009513 0.008496 0.008472 0.008472 0.009111 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049 0.009049

MODE 1.0858 1.1397 1.1404 1.1252 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404 1.1404

Table C.5: Generation Numbers and Fitness Values

Consumed Time 4 9 14 19 24 29 34 39 45 49 55 60 67 72 77 82 87 92

APPENDIX C.

153

Figure C.6: Generations-Time Representation

Figure C.7: Generation Numbers and Error Bar

By analyzing the statistical result, increasing the generation number will increase the fitness value and the consumed time until the generation number 400, after that, increasing in fitness values will become too slow (See Fig.C.7) because the solutions begin repeat, while the elapsed time still increases fixedly (See Fig.C.6). Most of GA with 400 generations results produces fitness more than the average. Its standard deviation is the lowest and in it consumed a reasonable time, therefore the best generation number is 400. (Selecting generation 500 for more reservation and to avoid the differentiate between different problem instances)

APPENDIX C.

154

Chromosome is a completed solution; the number of individuals contained in the initial population is an important parameter for the GA. this section analyzes the influence of this parameter on the performance of the GA, from an experimental perspective. The following experiment Table.C.6 displays the achieved fitness value and the consumed time for each population size (chromosome number). The default Generation Number is 400, pool size is 3, crossover rate is 0.9, and mutation rate is 0.4. Population size 25 50 75 100 125 150 175 200

AVG 1.1327 1.14378 1.14299 1.14509 1.14741 1.14637 1.1469 1.14428

MAX 1.1541 1.1567 1.1555 1.1561 1.1567 1.1561 1.1561 1.1561

MIN 1.1056 1.1236 1.1252 1.1252 1.1326 1.1308 1.1328 1.1252

STD 0.014522 0.009275 0.007872 0.008183 0.007961 0.009821 0.006151 0.008115

MODE 1.1541 1.1567 1.1555 1.1561 1.1567 1.1561 1.1561 1.1561

Consumed Time 3 8 17 37 70 129 219 338

Table C.6: Chromosome Numbers and Fitness Values

Figure C.8: Chromosomes-Time Representation

Figure C.9: Chromosome Numbers and Error Bar By analysing the statistical results, whenever chromosomes number become larger, a different solutions will produced and the finding the optimal solution probability will increased until the chromosome number 125, after that the average produced fitness values alternated between already found fitness values with more consumed time, Deceleration in the fitness values improvement is because a redundant solution produced while the time wasted (See Fig. C.8 and Fig.C.9). Although GA with 125 chromosomes version produce better fitness than GA version with 100 chromosomes, but GA with 100 chromosome is the best parameter because it’s consumed time is in the half of GA with 125 chromosome version.

APPENDIX C.

155

Most of GA with 100 chromosome number results produces the maximum achieved fitness. Its standard deviation is low and in it consumed a reasonable time, therefore the best chromosome number is 100. The selection methods can be observed in their varying degrees of selection pressure. Selection pressure is the intensity with which a GA tends to either eliminate an individual or give it an adaptive advantage.

APPENDIX C.

156

Tournament is a set (pool) of chromosomes randomly selected from the population and the best chromosome from this pool will selected as parent, whenever the pool size increased, the chance to select weak chromosomes will reduced. This section has been experimental the best selection pressure by adjusting the tournament size. The following experiment Table.C.7 displays the achieved fitness value for each pool size. The default Generation Number is 400, chromosome Number is 100, crossover rate is 0.9, and mutation rate is 0.4. Tournament Size 1 2 3 4 5 6 7 8 9 10

AVG 1.14763 1.14693 1.15045 1.14578 1.1462 1.14394 1.14698 1.14665 1.14432 1.14734

MAX 1.1561 1.1561 1.1561 1.1561 1.1561 1.1561 1.1567 1.1561 1.1561 1.1561

MIN 1.1332 1.1328 1.1387 1.1252 1.1252 1.1178 1.133 1.133 1.1326 1.1252

STD 0.006433 0.006629 0.005239 0.009613 0.009676 0.009761 0.006573 0.007141 0.00795 0.008081

MODE 1.1561 1.1561 1.1561 1.1561 1.1561 1.1404 1.1404 1.1561 1.1404 1.1561

Table C.7: Tournament Sizes and Fitness Values

Figure C.10: Tournament sizes and Error Bar

Whenever the pool size increased the chance to select weak chromosomes will reduced, until some extent, after that we have to apply the randomize concept to explore more search space areas. Pool with size 3 achieved the highest fitness value and most of solution with size pool 3 achieved that fitness value.

APPENDIX C.

157

Crossover is reproduction operation used by GA to form a new chromosomes from the selected parents, crossover rate will effect on the evolution speed to reach to the optimal solution, whenever crossover rate increased the reproduction operator will increase, at some point we may have to stop crossover process, we need to copy some chromosomes or to use other Genetic operations without crossover operation influence. The following experiment Table.C.8 displays the achieved fitness value for different crossover rate values. The default Generation Number is 400, chromosome Number is 100, pool size is 3, and mutation rate is 0.4. Crossover Rate 0.5 0.6 0.7 0.8 0.9 1

AVG 1.133 1.13023 1.12651 1.14216 1.14684 1.14412

MAX 1.1567 1.1567 1.1561 1.1561 1.1567 1.1561

MIN 1.0515 1.0507 1.061 1.1231 1.1258 1.1252

STD 0.0320759 0.0302135 0.0319496 0.00956911 0.00945593 0.00943243

MODE 1.1478 1.148 1.1404 1.1478 1.1561 1.1561

Table C.8: Crossover Rates and Fitness Values

Figure C.11: Crossover Rates and Error Bar

By analyzing the statistical results , GA results fitness increased whenever it use crossover operation, copying 10% chromosome without crossover will increase the fitness more than applying crossover totally on the whole population , so crossover with 90% rate is better than totally applying crossover ( rate 100%). Most of GA with 0.9 crossover rate results produces the maximum achieved fitness. Its standard deviation is low and in it consumed a reasonable time, therefore the best crossover rate is 0.9.

APPENDIX C.

158

Mutation is a genetic operator that alters one or more gene values in a chromosome from its initial state. This can result in entirely new gene values being added to the gene pool. With these new gene values, the genetic algorithm may be able to arrive at better solution than was previously possible. Mutation is an important part of the genetic search as help to prevent the population from stagnating at any local optima. Mutation occurs during evolution according to a selected mutation rate. This rate should usually be set fairly low. If it is set to high, the search will turn into a primitive random search. The following experiment Table C.9 displays the achieved fitness value for each mutation rate. The default Generation Number is 400, chromosome Number is 100, pool size is 3, and crossover rate is 0.9. Mutation Rate 0 0.1 0.2 0.3 0.4 0.5

AVG 0.4794 1.13403 1.14706 1.14622 1.15007 1.14345

MAX 0.5512 1.1561 1.1561 1.1561 1.1561 1.1561

MIN 0.4194 1.0878 1.1385 1.1252 1.1385 1.1326

STD 0.045006 0.015045 0.005102 0.008397 0.006734 0.006394

MODE 0.5512 1.1408 1.1478 1.1404 1.1561 1.1404

Table C.9: Mutation Rates and Fitness Values

Figure C.12: Mutation Rate and Error Bar

Mutation rate 0.4 has achieved the highest fitness value, we have to explore sub search space for a while before jumping to other space by mutation operator, so applying mutation operator by 40% rate will achieve better results.

APPENDIX C.

C.4

159

parallel Genetic Algorithm design

1. Thread Creation Algorithm 13: Thread Creation This part Defines multiple threads in the main method by extending the thread , each thread will start its own GA version and return a best chromosome, then there will be comparing between best chromosomes among all threads . pre: nothing post: nothing return :nothing Create array of threads ”th” of size [thread size] Loop until thread size Initialize all threads in array th in PGA Set the name of each thread to (” ”+i) Chromosome Solution Initialize Solution to best solution in th[0] Loop from i = 0 until thread size if Solution fitness value >best fitness in th[i] then set Solution to best chromosome in th[i]

APPENDIX C.

160

2. IndePGA algorithm Algorithm 14: PGA extend Thread This algorithm is for thread object to Creates a new thread and makes it runnable, new thread begins its life inside method Parallel() to execute GA version., then storing the best chromosome among thread populations. pre: array of threads post: setting best chromosome for each thread return :nothing Chromosome Best PGA(String name) Start the thread and make it runnable run() Chromosome Best thread begins its life inside method Parallel() Store best Chromosome for thread in Best Algorithm 15: PGA extend Thread This algorithm is for thread object to Creates a new thread and makes it runnable, new thread begins its life inside method Parallel() to execute GA version., then storing the best chromosome among thread populations. pre: nothing post: Best solution return :best sloution list of population Initialization(population[0]) List of two Chromosome parents List of two Chromosome offspring2 Loop until num population Intalize Current population by the previous one Loop until Crossover rate Select Chromosome parent1 Select Chromosome parent2 Crossover parent1and parent2 and save it in parent Replace the new population with the parent Select a chromes from current population and save it in parent3 Mutete parent3 and save it in first offspring Replace a two chromsomes with offspring2 SelectSolution from population return Sloution

APPENDIX C.

C.4.1

161

Elite Server Algorithm

(a) Migration Algorithm Algorithm 16: Migrate(Population p) this algorithm will copy server’s chromosomes to the subpopulation if its request. pre: Population needs server’s chromosomes to enhance its solution post: changing the worst chromosomes by server’s chromosomes return :nothing Loop until num of chromosomes Sort server’s chromosomes by its fitness Loop until server size Copy server’s chromosomes to the population p

APPENDIX C.

162

(b) Send To Server Algorithm: Algorithm 17: SendToSerever(Chromosome eliteC ) this algorithm for sending best solution from each subpopulation each generation when its change. pre: Selecting best new chromosome(Elite). post: Adding Elite to the server. return :nothing Loop until server size Sort server’s chromosomes by its fitness if eliteC fitness >Server[0] fitness then Copy eliteC to server[0]

APPENDIX C.

163

(c) Subpopulation client Algorithm Algorithm 18: Subpopulation client this algorithm to implement complete version for each subpopulation. pre: nothing. post: each subpopulation complete its task, then select determine best solution from each subpopulation. return :nothing Initialize population Loop for num of generations selection Crossover Mutation Initialize Longevity if Elite is updated then Send Elite to Elite Server else Decrement Longevity-1 if Longevity ==0 then Send request toElite Server. Receive Elite from Elite Server. Replace worst individual with Elite. Initialize Longevity

APPENDIX C.

C.5

164

PGA parameter setting

IndePGA Parameter Setting: IndePGA generations can contribute to explore the search space but can decrease the overall efficiency. Testing and setting the parameter to best parameter can improve the overall efficiency. The following experiment (Table.10) displays the achieved fitness value and the consumed time for each generation number. The default population size is 100, pool size is 4, crossover rate is 0.9, thread size is 8, and mutation rate is 0.25. Generation Size 50 100 150 200 250

AVG 1.00076 1.11963 1.14541 1.1515 1.15479

MAX 1.02289 1.13419 1.15612 1.15612 1.15672

MIN 0.961156 1.09669 1.12883 1.14547 1.14796

STD 0.01754 0.008751 0.006544 0.004053 0.002766

MODE 1.02289 1.13419 1.15612 1.15553 1.15612

TIME 1 2 6 8 11

Table C.10: Generation Numbers and Fitness Values After many experiments the best generation size was 250, so exceeding this value will cause load in memory and consuming more time.

Figure C.13: Generation Numbers and Error Bar By analysing the statistical result , IndePGA results fitness increased until the generation number 250 (See Fig.C.13). Most of PGA with 250 generations results produces fitness more than the average. Its standard deviation is the lowest and in it consumed a reasonable time, therefore the best generation number is 250 within the tested framework.

APPENDIX C.

165

Population size parameter influence on the performance of the IndePGA will be analyzed in the following context from an experimental perspective. Table.C.11 displays the achieved fitness value for each population size. The default Generation Number is 250, pool size is 4, thread size is 12, and mutation rate is 0.25. Population size 25 50 75 100

AVG 1.08632 1.13964 1.14961 1.15494

MAX 1.11763 1.15553 1.15568 1.15672

MIN 1.06903 1.12475 1.14018 1.14784

STD 0.014274 0.008919 0.004714 0.002762

MODE 1.11763 1.15553 1.15568 1.15612

TIME 1 2 4 5

Table C.11: chromosomes numbers and Fitness Values

Figure C.14: chromosomes numbers and Error Bar By analyzing the statistical results, whenever chromosomes number become larger, a different solutions will produced and finding the optimal solution probability will increased .In our observation after chromosome number 100 (See Fig.C.14), Stability in the fitness value appears within the tested framework.

APPENDIX C.

166

Crossover rate is the next experiment that has been evaluated on crossover operation to set its best value. Table.C.12 displays the achieved fitness value for each crossover rate. The default Generation Number is 250, chromosome Number is 100, pool size is 4, thread size is 12, and mutation rate is 0.25. Crossover rate 0.5 0.55 0.6 0.65 0.7 0.75 0.8 0.85

AVG 1.15451 1.15434 1.15389 1.15491 1.15555 1.15431 1.15505 1.15603

MAX 1.15672 1.15612 1.15643 1.15612 1.15672 1.15672 1.15612 1.15672

MIN 1.14796 1.14796 1.14796 1.14796 1.14796 1.14721 1.14721 1.15507

STD 0.003114 0.003053 0.003348 0.002783 0.001799 0.003316 0.002604 0.000452

MODE 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612

Table C.12: Crossover Rates and Fitness Values

Figure C.15: Crossover Rates and Error Bar

The crossover rate has been set to 0.85 since the fitness value recorded is best value in to 0.85 within the tested range.

APPENDIX C.

167

Mutation rate parameter influence on the performance of the IndePGA has been evaluated in Table.C.13 from an experimental perspective. The default Generation Number is 400, chromosome Number is 100, pool size is 3, thread size is 12, and crossover rate is 0.9. Mutation rate 0.05 0.1 0.15 0.2 0.25 0.3 0.35 0.4 0.45

AVG 1.08129 1.13397 1.1478 1.15181 1.15532 1.15567 1.15615 1.15615 1.15627

MAX 1.10008 1.15282 1.15672 1.15672 1.15672 1.15672 1.15672 1.15672 1.15672

MIN 1.04356 1.11088 1.13903 1.14002 1.1479 1.14871 1.15538 1.15553 1.15553

STD 0.015961 0.010564 0.005264 0.004605 0.002361 0.001631 0.000314 0.000296 0.00032

MODE 1.10008 1.15282 1.15672 1.14796 1.15612 1.15612 1.15612 1.15612 1.15612

Table C.13: Mutation Rates and Fitness Values

Figure C.16: Mutation Rates and Error Bar The Mutation rate has been set to 0.45 since the fitness value recorded it best value in this range.

APPENDIX C.

168

Tournament size in the following experiment evaluated the best selection pressure by adjusting it. Table.C.14 Displays the achieved fitness value for each tournament size. The default Generation Number is 400, chromosome Number is 100, thread size is 12, crossover rate is 0.9, and mutation rate 0.45. Tournament pool size 1 2 3 4 5 6 7 8 9 10

AVG 1.15347 1.15458 1.15412 1.15568 1.1555 1.15492 1.15397 1.15453 1.15517 1.15448

MAX 1.15612 1.15672 1.15672 1.1567 1.15612 1.15612 1.15672 1.15672 1.15672 1.15672

MIN 1.14694 1.14754 1.14833 1.14858 1.14796 1.14796 1.14784 1.14761 1.14784 1.14116

STD 0.003219 0.003094 0.003253 0.001653 0.001757 0.002281 0.003407 0.003189 0.002381 0.003918

MODE 1.15553 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612

Table C.14: Tournament Size and Fitness Values

Figure C.17: Tournament Sizes and Error Bar

The tournament pool size has been set to 4 since the fitness value recorded. It best value in this range.

APPENDIX C.

169

Thread size in the following experiment Table.C.15 displays the achieved fitness value for each of them. The default Generation Number is 400, chromosome Number is 100, crossover rate is 0.9, 4 pool size, and mutation rate 0.45. Thread size 5 6 7 8 9 10 11 12

AVG 1.15536 1.15496 1.15587 1.15618 1.15576 1.15606 1.15584 1.15587

MAX 1.15672 1.15672 1.15672 1.15672 1.15672 1.15672 1.15672 1.15672

MIN 1.14844 1.14668 1.14871 1.15584 1.14871 1.15538 1.14871 1.14871

STD 0.002328 0.003076 0.001616 0.000195 0.001763 0.000338 0.001683 0.001662

MODE 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612 1.15612

Table C.15: Thread Sizes and Fitness Values

Figure C.18: Thread Sizes and Error Bar The thread size has set to 8 since the fitness value recorded its best value in this range so exceeding this value.

APPENDIX C.

170

(a) Parameters setting for ACoPGA Finding effective settings for parameters need many experiments to decide the best sitting that will enhance the efficiency. In ACoPGA many parameters should be taken in our consideration the first one interval which means the time interval that the client sends its elites. Secondly longevity which is a parameter to control the timing of the request for chromosome from the elite server and finally the number of migrants send by the elite server. The parameters for GA are as follows. Total population size is 300. Crossover is single point crossover for each variable. Crossover rate is 0.95. Mutation is applied at bit level with 0.45 probabilities. (b) Interval: A Subpopulation Client does not send an elite chromosome information at every elite updating, but once in every times α of updates. By increasing α , this system is able to reduce the communication frequency from a Subpopulation Client to Elite Server. interval 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

AVG 1.15494 1.15374 1.15451 1.15492 1.15584 1.15425 1.15537 1.15533 1.15469 1.15572 1.1558 1.15621 1.15612 1.15627 1.15621 1.15624 1.15627 1.15633 1.15627 1.15618 1.1563 1.15627 1.15627 1.1559 1.15624 1.15633 1.1563 1.15621 1.15627 1.15618 Table C.16:

MAX MIN STD MODE 1.15672 1.14784 0.002999 1.15612 1.15672 1.14784 0.003652 1.15612 1.15672 1.14784 0.003247 1.15612 1.15672 1.14784 0.002871 1.15612 1.15672 1.14871 0.001651 1.15612 1.15672 1.14784 0.003397 1.15612 1.15672 1.14796 0.002369 1.15612 1.15612 1.14844 0.002298 1.15612 1.15672 1.14844 0.003104 1.15612 1.15672 1.14871 0.001648 1.15612 1.15672 1.14796 0.001822 1.15612 1.15672 1.15612 0.000213 1.15612 1.15672 1.15553 0.000188 1.15612 1.15672 1.15553 0.00032 1.15612 1.15672 1.15553 0.000341 1.15612 1.15672 1.15612 0.000239 1.15612 1.15672 1.15553 0.000371 1.15612 1.15672 1.15612 0.000285 1.15612 1.15672 1.15553 0.00032 1.15612 1.15672 1.15553 0.00026 1.15612 1.15672 1.15553 0.000332 1.15612 1.15672 1.15553 0.00032 1.15612 1.15672 1.15553 0.00032 1.15612 1.15672 1.14871 0.001681 1.15612 1.15672 1.15612 0.000239 1.15612 1.15672 1.15612 0.000285 1.15612 1.15672 1.15612 0.000274 1.15612 1.15672 1.15553 0.000341 1.15612 1.15672 1.15538 0.000339 1.15612 1.15672 1.15553 0.00026 1.15612 Interval Numbers and Fitness Values

time 17 16 18 18 16 16 16 17 18 17 16 17 16 17 18 17 18 18 18 19 18 18 16 17 16 18 17 16 17 17

APPENDIX C.

171

Figure C.19: Interval Numbers and Fitness Values - Average Representation After every 18 updating for the elite the clients will send to the server the elites since its recorded its best average value as shown in Table.C.16.

APPENDIX C.

172

(c) longevity The Longevity parameter (γ) is to control the timing of the request for chromosome import from elite server. To simplify the implementation, γ is decremented by 1 when, a generation, elite is not updated. And the migration occurs when γ is equal to 0. When elite is updated, γ is reset to the initial value. Thus, when GA searches well within a Subpopulation client, it does not need to send migration request to elite server and the communication frequency is become less. longevity AVG 1 1.15621 2 1.15624 3 1.15636 4 1.15624 5 1.1563 6 1.15636 7 1.15627 8 1.15587 9 1.15633 10 1.15632 11 1.15621 12 1.15623 13 1.15618 14 1.15636 15 1.15633 16 1.15624 17 1.15621 18 1.15628 19 1.15621 20 1.15621 Table C.17:

MAX MIN STD MODE 1.15672 1.15612 0.000213 1.15612 1.15672 1.15612 0.000239 1.15612 1.15672 1.15612 0.000293 1.15612 1.15672 1.15612 0.000239 1.15612 1.15672 1.15612 0.000274 1.15612 1.15672 1.15612 0.000293 1.15612 1.15672 1.15612 0.000259 1.15612 1.15672 1.14871 0.001672 1.15612 1.15672 1.15553 0.000341 1.15612 1.15672 1.15612 0.000272 1.15612 1.15672 1.15612 0.000213 1.15612 1.15672 1.15538 0.00039 1.15612 1.15672 1.15612 0.000179 1.15612 1.15672 1.15612 0.000293 1.15612 1.15672 1.15538 0.000407 1.15672 1.15672 1.15553 0.000304 1.15612 1.15672 1.15553 0.000284 1.15612 1.15672 1.15553 0.000353 1.15612 1.15672 1.15538 0.000358 1.15612 1.15672 1.15612 0.000213 1.15612 Initial Longevity and Fitness Values

time 20 18 18 18 17 18 18 18 18 18 18 18 18 19 20 19 18 18 18 18

Figure C.20: Initial Longevity and Fitness Values - Average Representation The average value of longevity( 6 ) outperformed all the other values as shown in Table.C.17

APPENDIX C.

173

(d) Server Size As described before, the migration occurs when longevity Parameter in a Subpopulation Client is decremented to 0. The Subpopulation Client requests to Elite Server for migrants and Elite Server sends µ chromosome (Server Size) information to the Subpopulation Client. Individuals selected randomly in the Subpopulation Client are replaced by newly received individuals. To ensure traffic kept low when the number of migrants (µ) is small. migration AVG MAX MIN STD MODE 1 1.15596 1.15672 1.14871 0.001688 1.15612 2 1.15627 1.15672 1.15553 0.00032 1.15612 3 1.15624 1.15672 1.15612 0.000239 1.15612 4 1.1563 1.15672 1.15612 0.000274 1.15612 5 1.15624 1.15672 1.15553 0.000304 1.15612 6 1.1558 1.15672 1.14784 0.001847 1.15612 7 1.15579 1.15672 1.14884 0.00161 1.15612 8 1.15633 1.15672 1.15553 0.000341 1.15612 9 1.15627 1.15672 1.15612 0.000259 1.15612 10 1.1563 1.15672 1.15612 0.000274 1.15612 11 1.15627 1.15672 1.15612 0.000259 1.15612 12 1.15627 1.15672 1.15612 0.000259 1.15612 13 1.15618 1.15672 1.15553 0.00026 1.15612 14 1.15556 1.15672 1.14871 0.002301 1.15612 15 1.15627 1.15672 1.15612 0.000259 1.15612 16 1.15621 1.15672 1.15553 0.000284 1.15612 17 1.15552 1.15672 1.14844 0.002337 1.15612 18 1.15627 1.15672 1.15612 0.000259 1.15612 19 1.15624 1.15672 1.15553 0.000304 1.15612 20 1.15592 1.15672 1.14844 0.001737 1.15612 Table C.18: Number of migrants and Fitness Values

time 17 17 16 16 18 16 16 18 16 16 16 15 16 17 17 16 16 16 16 16

Figure C.21: Number of migrants and Fitness Values - Average Representation So the server size will be set to 10 after donning some experiments this value outperformed all other values within the tested range see Figure.C.21

APPENDIX C.

C.5.0.1

174

Synchronous Migration Algorithm

Algorithm 19: Migration () this algorithm migrate the Chromosome from one neighbor to other to explore the search space and reach the global optima. pre: nothing. post: change the one thread population Chromosomes with its neighbour best Chromosomes return :nothing if population number % migration interval== 0 then List of n best Chromosomes = Thread[i].MAX(population [i],migration rate) if the neighbor thread is alive then Replace worst (thread[i+1], best Chromosomes)

APPENDIX C.

175

1. Parameters setting for SCoPGA: Finding effective settings for parameters need many experiments to decide the best sitting that will enhance the efficiency. In SGA many parameters should be taken in our consideration the first one is migration interval which means the time interval that the node send its best n chromosome to it’s neighbour. Secondly Migration rate which is a parameter to control the number of best chromosomes send to the neighbour. The parameters for GA are as follows. Total population size is 300. Crossover is single point crossover for each variable. Crossover rate is 0.95. Mutation is applied at bit level with 0.45 probabilities. 2. Migration interval: A neighbour node does not send n best chromosomes information at every generation, but once after (α) period. By increasing α , the information sharing will decrease and the communication overhead will decreased at the same time. migration interval 25 50

AVG 1.15225 1.15409

MAX 1.15672 1.15672

MIN 1.14017 1.14043

STD 0.004694 0.003955

MODE 1.15612 1.15612

consumed time 7 7

Table C.19: Migration interval and Fitness Values Representation

From the statistical results, there is no too different between the 25 and 50 period in term of achieved fitness and consumed time, but 50 rate is almost better see Table.C.19.

Figure C.22: Migration interval and Fitness Values Representation

APPENDIX C.

176

3. Migration rate: Each (α) period, node n will migrate its best n chromosomes to the next node, increasing the number of migrated chromosomes will enhance the information sharing but will reduce the results diversity. Migration rate 5 10 15 20 25

AVG 1.15371 1.15384 1.1537 1.15266 1.15388

MAX 1.15672 1.16467 1.15672 1.15672 1.15672

MIN 1.14784 1.14078 1.13861 1.14043 1.14043

STD 0.003648 0.004432 0.004859 0.004939 0.003855

MODE 1.15612 1.15612 1.15612 1.15612 1.15612

consumed time 8 8 10 7 6

Table C.20: Migration rate and Fitness Values Representation From the statistical results, the migration rate set to 25 the consumed time and the achieved fitness recorded the highest values within the tested framework see table.C.20.

Figure C.23: Migration rate and Fitness Values Representation

177

Appendix D

D.1

Intelligent Assessment System Test

D.1.1

Engine Subsystem

1. Initialization: Short Description: In initialization, individuals are created by assign a random teacher from qualified teacher list to each section randomly by considering only hard constraint. Input

Expected output

Output

individuals are created randomly

Initial population of random created chromosomes that considering hard constraint.

Course= OS Section: 271 Teacher= Shahad Alqefari, AC:3, AH:15

Result

Course= OS Section: 272 Teacher= Shahad Alqefari, AC:3, AH:15 Course= PL Section: 271 Teacher= Rola Alorini, AC:2, AH:7 Course= PL Section: 272 Teacher= Sarah Alonzi, AC:3, AH:15 Course= c++ Section: 271 Teacher= Rola Alorini, AC:2, AH:7 Course= c++ Section: 272 Teacher= Sarah Alonzi, AC:3, AH:15 Course= PM Section: 271 Teacher= Fatmah Jaddoh, AC:2, AH:12 Course= PM Section: 272 Teacher= Fatmah Jaddoh, AC:2, AH:12 Course= SE Section: 271 Teacher= Fatmah Jaddoh, AC:2, AH:12 Course= SE Section: 272 Teacher= Fatmah Jaddoh, AC:2, AH:12 Course= IP Section: 271 Teacher= Ftoon, AC:1, AH:0 Course= IP Section: 272 Teacher= Ftoon, AC:1, AH:0 Course= MN Section: 271 Teacher= Shahad Alqefari, AC:3, AH:15 Course= MN Section: 272 Teacher= Entesar Alonzi, AC:1, AH:3 Course= NS Section: 271 Teacher= Ebtehal Turki, AC:2, AH:10 Course= NS Section: 272 Teacher= Ebtehal Turki, AC:2, AH:10 Course= Compiler Section: 271 Teacher= Sarah Alonzi, AC:3, AH:15 Course= Compiler Section: 272 Teacher= Sarah Alonzi, AC:3, AH:15 Course= DataStructure Section: 271 Teacher= May Fahad, AC:1, AH:8 Course= DataStructure Section: 272 Teacher= May Fahad, AC:1, AH:8 Course= Network Section: 271 Teacher= Ebtehal Turki, AC:2, AH:10 Course= Network Section: 272 Teacher= Shahad Alqefari, AC:3, AH:15

Result: correct
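As an illustration of the behaviour verified above, the following minimal Java sketch creates one individual by assigning to every section a random teacher drawn from that course's qualified-teacher list, so only assignments that respect the hard constraint are produced. The Initializer class, its map parameters and the gene representation are simplified assumptions, not the engine's actual code.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Random;

class Initializer {
    private final Random random = new Random();

    // Returns one individual: a gene {course, section id, teacher} per section.
    List<String[]> createIndividual(Map<String, List<String>> qualifiedTeachers,
                                    Map<String, List<Integer>> courseSections) {
        List<String[]> genes = new ArrayList<>();
        for (Map.Entry<String, List<Integer>> course : courseSections.entrySet()) {
            List<String> qualified = qualifiedTeachers.get(course.getKey());
            for (Integer sectionId : course.getValue()) {
                // Hard constraint: only teachers qualified for this course are drawn.
                String teacher = qualified.get(random.nextInt(qualified.size()));
                genes.add(new String[] { course.getKey(), sectionId.toString(), teacher });
            }
        }
        return genes;
    }
}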


2. Fitness: Short Description: Computes the objective function over several factors, including consideration of teacher preferences, experience, and fairness between teachers according to their assigned courses, assigned hours, and the satisfaction of each teacher's preferences (an illustrative sketch follows this test case).

Input:

Compute fitness for a chromosome: Course= C++1 Section: 271 Teacher= Eiman Fahad, AC:1, AH:4 Course= C++1 Section: 272 Teacher= Nofe Adel, AC:3, AH:12 Course= C++1 Section: 273 Teacher= Hend Saleh, AC:1, AH:4 Course= C++1 Section: 274 Teacher= Nesrin Majed, AC:2, AH:12 Course= C++1 Section: 275 Teacher= Thoria Khalid, AC:2, AH:7 Course= Descrite Structure Section: 271 Teacher= Samah Saab, AC:3, AH:11 Course= Descrite Structure Section: 272 Teacher= Hend Naser, AC:3, AH:11 Course= Descrite Structure Section: 273 Teacher= Nofe Adel, AC:3, AH:12 Course= Descrite Structure Section: 274 Teacher= Huda Naser, AC:1, AH:4 Course= C++2 Section: 271 Teacher= Layla Saleh, AC:1, AH:4 Course= C++2 Section: 272 Teacher= Soha Fahad, AC:3, AH:12 Course= C++2 Section: 273 Teacher= Nora Saleh, AC:2, AH:10 Course= C++2 Section: 274 Teacher= Khawlah Abdalaziz, AC:2, AH:7 Course= C++2 Section: 275 Teacher= Samah Saab, AC:3, AH:11 Course= Digital Logic Section: 271 Teacher= Soha Fahad, AC:3, AH:12 Course= Digital Logic Section:

272 Teacher= Wojdan Mohammed, AC:2, AH:8 Course= Data Structure Section: 271 Teacher= Soha Fahad, AC:3, AH:12 Course= Data Structure Section: 272 Teacher= Entesar Alonzy, AC:2, AH:7 Course= Data Structure Section: 273 Teacher= Sara Abdullah, AC:3, AH:11 Course= Data Structure Section: 274 Teacher= Amal Mohhamed, AC:2, AH:7 Course= Assembly Section: 271 Teacher= Maha Khalid, AC:2, AH:7 Course= Assembly Section: 272 Teacher= Khawlah Abdalaziz, AC:2, AH:7 Course= Software Engineering Section: 271 Teacher= Ebtehal Turki, AC:1, AH:6 Course= Software Engineering Section: 272 Teacher= Ebtehal Turki, AC:1, AH:6 Course= Algorithm Section: 271 Teacher= Montaha Ali, AC:1, AH:6 Course= Algorithm Section: 272 Teacher= Montaha Ali, AC:1, AH:6 Course= Architecture Section: 271 Teacher= Najla Bandr, AC:1, AH:4



Course= Architecture Section: 272 Teacher= Hend Naser, AC:3, AH:11 Course= Operating System Section: 271 Teacher= Maha Khalid, AC:2, AH:7 Course= Operating System Section: 272 Teacher= Manahel Ali, AC:1, AH:4 Course= Operating System Section: 273 Teacher= Wojdan Mohammed, AC:2, AH:8 Course= Operating System Section: 274 Teacher= Sara Naif, AC:1, AH:4 Course= Database Section: 271 Teacher= Jwaher Ali, AC:1, AH:3 Course= Database Section: 272 Teacher= Samah Saab, AC:3, AH:11 Course= Programming Language Section: 271 Teacher= Mona Hamed, AC:2, AH:12 Course= Programming Language Section: 272 Teacher= Bedriah Hamed, AC:2, AH:5 Course= Network Section: 271 Teacher= Sara Abdullah, AC:3, AH:11 Course= Network Section: 272 Teacher= Ahlam Ali, AC:1, AH:8 Course= Network Section: 273 Teacher= Nofe Adel, AC:3, AH:12 Course= Network Section: 274 Teacher= Ahlam Ali, AC:1, AH:8 Course= Image Prcessing Section: 271 Teacher= Nora Saleh, AC:2, AH:10 Course= Image Prcessing Section: 272 Teacher= Nora Saleh, AC:2, AH:10 Course= Seminar Section: 271 Teacher= Bedriah Hamed, AC:2, AH:5 Course= Seminar Section: 272 Teacher= Bedriah Hamed, AC:2, AH:5 Course= Artificial Intelligence Section: 271 Teacher= Hend Naser, AC:3, AH:11 Course= Artificial Intelligence Section: 272 Teacher= Entesar Alonzy, AC:2, AH:7 Course= Network Security Section: 271 Teacher= Thoria Khalid, AC:2, AH:7 Course= Network Security Section: 272 Teacher= Nadia Hasen, AC:1, AH:3 Course= Mobile Network Section: 271 Teacher= Mona Hamed, AC:2, AH:12 Course= Mobile Network Section: 272 Teacher= Mona Hamed, AC:2, AH:12 Course= Mobile Network Section: 273 Teacher= Mona Hamed, AC:2, AH:12 Course= Compiler Section: 271 Teacher= Nesrin Majed, AC:2, AH:12 Course= Compiler Section: 272 Teacher= Nesrin Majed, AC:2, AH:12 Course= Project Management Section: 271 Teacher= Amal Mohhamed, AC:2, AH:7


Course= Project Management Section: 272 Teacher= Sara Abdullah, AC:3, AH:11 Course= Arabization Section: 271 Teacher= Hessa Abdullah, AC:1, AH:6 Course= Arabization Section: 272 Teacher= Hessa Abdullah, AC:1, AH:6

Expected output: compute an accurate fitness value for this chromosome.
Result: correct

Input: compute the fitness for another chromosome.
Expected output: compute the fitness for the given chromosome.
Result: correct
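The report's exact objective function is not reproduced here; the following Java sketch only illustrates the general shape of such a fitness computation, combining the teachers' average preference satisfaction with a fairness term on assigned hours. The TeacherLoad record, the fairness measure and the way the terms are combined are assumptions made for illustration, not the engine's actual formula.

import java.util.List;

class FitnessSketch {
    // Hypothetical record of one teacher's totals (AH, AC) and preference satisfaction.
    static class TeacherLoad {
        int assignedHours;      // AH
        int assignedCourses;    // AC
        double preferenceScore; // fraction of this teacher's preferences satisfied (0..1)
        TeacherLoad(int ah, int ac, double pref) {
            assignedHours = ah; assignedCourses = ac; preferenceScore = pref;
        }
    }

    static double fitness(List<TeacherLoad> teachers) {
        double avgHours = teachers.stream().mapToInt(t -> t.assignedHours).average().orElse(0);
        double preference = teachers.stream().mapToDouble(t -> t.preferenceScore).average().orElse(0);
        // Fairness term: mean absolute deviation of assigned hours from the average.
        double unfairness = teachers.stream()
                .mapToDouble(t -> Math.abs(t.assignedHours - avgHours)).average().orElse(0);
        return preference + 1.0 / (1.0 + unfairness); // higher is better
    }
}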


3. Selection: Short Description: Selects the best individual from a pool of n individuals drawn from the population; n is the pool size, and the selection is used to pick the two individuals on which the genetic operators are applied (a code sketch follows this test case).

Input: select the best chromosome from a pool of size 4: chromosome0: Fitness: 1.0, chromosome1: Fitness: 2.0, chromosome2: Fitness: 3.0, chromosome3: Fitness: 4.0
Expected output: the best chromosome is chromosome3, with fitness = 4.
Output: Selected Fitness: 4.0
Result: correct

Input: select the best chromosome from a pool of size 3 and a population of size 9.
Expected output: best chromosome fitness = 9.233332.
Output: Selected Fitness: 9.233332
Result: correct

Input: select the best chromosome from a pool of size 2 and a population of size 9.
Expected output: best chromosome fitness = 9.155554.
Output: Selected Fitness: 9.155554
Result: correct
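The behaviour tested above matches a small pool (tournament-style) selection, sketched below in Java: draw poolSize chromosomes at random from the population and return the fittest one. The Selector and Chromosome classes are simplified stand-ins for the engine's real types; calling select twice yields the two individuals handed to the genetic operators.

import java.util.List;
import java.util.Random;

class Selector {
    static class Chromosome {
        final double fitness;
        Chromosome(double fitness) { this.fitness = fitness; }
    }

    private final Random random = new Random();

    Chromosome select(List<Chromosome> population, int poolSize) {
        Chromosome best = null;
        for (int i = 0; i < poolSize; i++) {
            Chromosome candidate = population.get(random.nextInt(population.size()));
            if (best == null || candidate.fitness > best.fitness) {
                best = candidate; // keep the fittest chromosome seen in the pool
            }
        }
        return best;
    }
}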


4. Crossover: Short Description: A genetic operator that takes two parents (chromosomes) and generates two new offspring using one-point crossover, in order to explore the whole search space (a code sketch follows this test case).

Input: two parent chromosomes of seven sections each, e.g. parent1 beginning with section1: Teacher: Entesar, Course: DB; section2: Teacher: Entesar, Course: DB; ... and parent2 with its own teacher/course assignments over the courses DB, PL, C++, OS and AI.
Expected output: two new offspring generated from the given parents by swapping the two parents' genes at the crossover point.
Output: Chromosome.1 and Chromosome.2, each combining the two parents' section assignments around the crossover point.
Result: correct

Input: a second pair of seven-section parent chromosomes, parent1 beginning with section1: Teacher: Shahad, Course: DB; section2: Teacher: Entesar, Course: DB; section3: Teacher: Montaha, Course: PL; ...
Expected output: two new offspring generated from the given parents by swapping the two parents' genes at the crossover point.
Output: Chromosome.1 and Chromosome.2.
Result: correct
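A minimal Java sketch of one-point crossover on a chromosome represented as an array with one teacher assignment per section is given below. The String[] gene representation and the class name are assumptions for the sketch; the real engine's chromosome class may differ.

import java.util.Random;

class OnePointCrossover {
    private final Random random = new Random();

    // Assumes the parents have the same length and at least two sections.
    String[][] crossover(String[] parent1, String[] parent2) {
        int point = 1 + random.nextInt(parent1.length - 1); // cut point in 1..length-1
        String[] child1 = new String[parent1.length];
        String[] child2 = new String[parent2.length];
        for (int i = 0; i < parent1.length; i++) {
            if (i < point) {            // genes before the cut point are kept
                child1[i] = parent1[i];
                child2[i] = parent2[i];
            } else {                    // genes after the cut point are swapped
                child1[i] = parent2[i];
                child2[i] = parent1[i];
            }
        }
        return new String[][] { child1, child2 };
    }
}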


5. Mutation: Short Description: An operator that takes a chromosome as input, mutates one point, and replaces that section's assigned teacher with another teacher who is qualified for the course (a code sketch follows this test case).

Input: a chromosome, given as the list of teachers with their assigned courses (AC) and assigned hours (AH) and the list of all assigned sections.
Expected output: select a random section, select a random qualified teacher for its course, assign that teacher to the section, and finally perform the required bookkeeping (assigned hours, assigned courses) correctly.
Output: Selected course: MN, section: 272; current teacher: Entesar Alonzi; randomly selected teacher: Ebtehal Turki. In the resulting chromosome, MN section 272 is reassigned to Ebtehal Turki, Entesar Alonzi's load drops from AC:2, AH:7 to AC:1, AH:4, Ebtehal Turki's load rises from AC:4, AH:18 to AC:5, AH:21, and all other assignments are unchanged.
Result: correct

Input: a second chromosome, given as its teacher list and assigned sections.
Expected output: as above.
Output: Selected course: MN, section: 272; current teacher: Shahad Alqefari; randomly selected teacher: Ebtehal Turki. In the resulting chromosome, MN section 272 is reassigned to Ebtehal Turki, Shahad Alqefari's load drops from AC:3, AH:18 to AC:3, AH:15, Ebtehal Turki's load rises from AC:2, AH:7 to AC:3, AH:10, and all other assignments are unchanged.
Result: correct
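The behaviour verified above (pick a random section, reassign a random qualified teacher, update the assigned-courses and assigned-hours bookkeeping) can be sketched in Java as follows. The Gene class, the qualified-teacher map and the assigned-hours map are simplified assumptions rather than the engine's actual data structures.

import java.util.List;
import java.util.Map;
import java.util.Random;

class Mutator {
    static class Gene {
        String course;
        int hours;       // weekly hours of this section
        String teacher;
        Gene(String course, int hours, String teacher) {
            this.course = course; this.hours = hours; this.teacher = teacher;
        }
    }

    private final Random random = new Random();

    void mutate(List<Gene> chromosome,
                Map<String, List<String>> qualifiedTeachers,
                Map<String, Integer> assignedHours) {
        // Pick a random section and a random teacher qualified for its course.
        Gene gene = chromosome.get(random.nextInt(chromosome.size()));
        List<String> qualified = qualifiedTeachers.get(gene.course);
        String newTeacher = qualified.get(random.nextInt(qualified.size()));

        // Update the assigned-hours bookkeeping for the old and the new teacher.
        assignedHours.merge(gene.teacher, -gene.hours, Integer::sum);
        assignedHours.merge(newTeacher, gene.hours, Integer::sum);
        gene.teacher = newTeacher;
    }
}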


6. Replacement: Short Description: A function that compares parents and children and replaces the worst (n) parents only if the new children are better (a code sketch follows this test case).

Input: 4 chromosomes; First Parent: Fitness: 0.40547717, Second Parent: Fitness: 0.37823796, First Child: Fitness: 0.29755926, Second Child: Fitness: 0.29498288
Expected output: select the 2 best chromosomes, which are the first parent and the second parent.
Result: correct

Input: 4 chromosomes; First Parent: Fitness: 0.22641051, Second Parent: Fitness: 0.2907737, First Child: Fitness: 0.18654612, Second Child: Fitness: 0.28115562
Expected output: select the 2 best chromosomes, which are the second parent and the second child.
Result: correct
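A minimal Java sketch of this replacement policy is shown below: the two fittest of the two parents and two children survive, so a child only replaces a parent when it is better. The class and field names are assumptions; the main method simply mirrors the first test row above.

import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

class Replacement {
    static class Chromosome {
        final String name;
        final double fitness;
        Chromosome(String name, double fitness) { this.name = name; this.fitness = fitness; }
    }

    static List<Chromosome> survivors(Chromosome p1, Chromosome p2,
                                      Chromosome c1, Chromosome c2) {
        return Arrays.asList(p1, p2, c1, c2).stream()
                .sorted(Comparator.comparingDouble((Chromosome ch) -> ch.fitness).reversed())
                .limit(2) // keep the two best of the four
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Mirrors the first test row: both parents beat both children.
        List<Chromosome> kept = survivors(new Chromosome("first parent", 0.40547717),
                new Chromosome("second parent", 0.37823796),
                new Chromosome("first child", 0.29755926),
                new Chromosome("second child", 0.29498288));
        kept.forEach(ch -> System.out.println(ch.name + " " + ch.fitness));
    }
}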


7. Validation: Short Description: Validation applies a penalty function if a chromosome violates the upper-load hard constraint by assigning a teacher more hours than his or her upper load (a code sketch follows this test case).

Input: a chromosome listing every teacher's assigned hours against an upper load of 12 (e.g. Teacher: Shahad Saleh, assigned hours: 0, upper load: 12; ...; Teacher: Nofe Adel, assigned hours: 12, upper load: 12), with no teacher above the upper load.
Expected output: there is no infeasible assignment for any teacher in this chromosome, so the fitness value remains the same.
Output: Fitness value: 0.7556339; after validation: Fitness value: 0.7556339
Result: correct

Input: a chromosome in which one teacher exceeds the upper load (Teacher: Mona Hamed, assigned hours: 14, upper load: 12), all other teachers being within their upper load of 12.
Expected output: the fitness value is changed according to the equation: new fitness = old fitness − (teacher.assignedHours − teacher.upperLoad) / numSections.
Output: Fitness value: 0.42697915; after validation: Fitness value: 0.39189142
Result: correct
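The penalty can be sketched in Java as below. The formula, new fitness = old fitness − (assigned hours − upper load) / number of sections, is the reconstruction used above; it is consistent with the second test case (0.42697915 − 2/57 ≈ 0.39189142) if the timetable contains 57 sections, as the preprocessing data suggests. The class and parameter names are assumptions for this sketch.

import java.util.Map;

class Validator {
    // Reduces the fitness by (excess hours / number of sections) for every violation.
    static double validate(double fitness, Map<String, Integer> assignedHours,
                           int upperLoad, int numSections) {
        for (int hours : assignedHours.values()) {
            if (hours > upperLoad) {
                fitness -= (double) (hours - upperLoad) / numSections; // penalty per violating teacher
            }
        }
        return fitness;
    }
}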


8. Select Solution: Short Description: Selects, among all populations, the one whose best chromosome has the highest fitness value, and sets that chromosome as the best solution (a code sketch follows this test case).

Input: array of populations P: P[0].best.fitness = 2.0, P[1].best.fitness = 8.0, P[2].best.fitness = 6.0
Expected output: the chromosome with the best fitness: P[1].best
Output: =Test Select Solution Method= Solution: 8.0, Population: 2.0, 8.0, 6.0
Result: correct

Input: array of populations P: P[0].best.fitness = 0.0, P[1].best.fitness = 3.0, P[2].best.fitness = 6.0, P[3].best.fitness = 9.0, P[4].best.fitness = 12.0, P[5].best.fitness = 15.0, P[6].best.fitness = 18.0
Expected output: the chromosome with the best fitness: P[6].best
Output: =Test Select Solution Method= Solution: 18.0, Population: 0.0, 3.0, 6.0, 9.0, 12.0, 15.0, 18.0
Result: correct
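A minimal Java sketch of the select-solution step is given below: scan each population's best chromosome and return the one with the highest fitness. Population and Chromosome are simplified stand-ins for the engine's classes.

import java.util.List;

class SolutionSelector {
    static class Chromosome {
        final double fitness;
        Chromosome(double fitness) { this.fitness = fitness; }
    }

    static class Population {
        final Chromosome best;
        Population(Chromosome best) { this.best = best; }
    }

    static Chromosome selectSolution(List<Population> populations) {
        Chromosome best = null;
        for (Population p : populations) {
            if (best == null || p.best.fitness > best.fitness) {
                best = p.best; // keep the highest best-of-population seen so far
            }
        }
        return best;
    }
}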


9. Preprocessing: Short Description: A preprocessing step associates the sections with their course and sets their ids, and assigns all qualified teachers to each course (a code sketch follows this test case).

Input: the list of courses with their number of sections, and the list of teachers: Shahad Saleh, Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr.
Expected output: sections are associated with their course and their ids are set; all qualified teachers are assigned to each course according to their specialization and degree.
Output:
Course Name: C++1, # Sections: 5, Course sections: 271, 272, 273, 274, 275
Course Name: Descrite Structure, # Sections: 4, Course sections: 271, 272, 273, 274
Course Name: C++2, # Sections: 5, Course sections: 271, 272, 273, 274, 275
Course Name: Digital Logic, # Sections: 2, Course sections: 271, 272
Course Name: Data Structure, # Sections: 4, Course sections: 271, 272, 273, 274
Course Name: Assembly, # Sections: 2, Course sections: 271, 272
Course Name: Software Engineering, # Sections: 2, Course sections: 271, 272
Course Name: Algorithm, # Sections: 2, Course sections: 271, 272
Course Name: Architecture, # Sections: 2, Course sections: 271, 272
Course Name: Operating System, # Sections: 4, Course sections: 271, 272, 273, 274
Course Name: Database, # Sections: 2, Course sections: 271, 272
Course Name: Programming Language, # Sections: 2, Course sections: 271, 272
Course Name: Network, # Sections: 4, Course sections: 271, 272, 273, 274
Course Name: Image Processing, # Sections: 2, Course sections: 271, 272
Course Name: Seminar, # Sections: 2, Course sections: 271, 272
Course Name: Artificial Intelligence, # Sections: 2, Course sections: 271, 272
Course Name: Network Security, # Sections: 2, Course sections: 271, 272
Course Name: Mobile Network, # Sections: 3, Course sections: 271, 272, 273
Course Name: Compiler, # Sections: 2, Course sections: 271, 272
Course Name: Project Management, # Sections: 2, Course sections: 271, 272
Course Name: Arabization, # Sections: 2, Course sections: 271, 272

Qualified teachers for each course:
C++1, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr]
Descrite Structure, qualified teachers are: [the same list as for C++1]
C++2, qualified teachers are: [the same list as for C++1]
Digital Logic, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara

Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Data Structure, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Assembly, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Software Engineering, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki] Algorithm, qualified teachers are: [Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid] Architecture, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Operating System, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Database, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Programming Language, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr]


Network, qualified teachers are: [Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed] Image Prcessing, qualified teachers are: [Nora Saleh, Manahel Ali] Seminar, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Artificial Intelligence, qualified teachers are: [Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid] Network Security, qualified teachers are: [Thoria Khalid, Huda Naser, Nadia Hasen] Mobile Network, qualified teachers are: [Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed] Compiler, qualified teachers are: [Nesrin Majed, Soha Fahad, Wojdan Mohammed] Project Management, qualified teachers are: [Fatmah Mohammed, Ebtehal Turki, Montaha Ali, Entesar Alonzy, Hend Naser, Maha Khalid, Eiman Fahad, Sara Abdullah, Nofe Adel, Ahlam Ali, Mona Hamed, Jwaher Ali, Bedriah Hamed, Nora Saleh, Manahel Ali, Thoria Khalid, Huda Naser, Nadia Hasen, Nesrin Majed, Soha Fahad, Wojdan Mohammed, Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr] Arabization, qualified teachers are: [Hend Saleh, Khawlah Abdalaziz, Layla Saleh, Sara Naif, Hessa Abdullah, Amal Mohhamed, Samah Saab, Najla Bandr]

Result: correct
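The preprocessing step can be sketched in Java as follows: build the section-id list per course (starting from 271, as in the test output) and collect the qualified teachers per course. The Teacher class and its isQualifiedFor check are hypothetical simplifications of the real specialization-and-degree rules.

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class Preprocessor {
    static class Teacher {
        final String name;
        final List<String> specializations;
        Teacher(String name, List<String> specializations) {
            this.name = name; this.specializations = specializations;
        }
        boolean isQualifiedFor(String course) { return specializations.contains(course); }
    }

    // Associates each course with its section ids: 271, 272, ...
    static Map<String, List<Integer>> buildSections(Map<String, Integer> sectionCounts) {
        Map<String, List<Integer>> sections = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> course : sectionCounts.entrySet()) {
            List<Integer> ids = new ArrayList<>();
            for (int i = 0; i < course.getValue(); i++) {
                ids.add(271 + i);
            }
            sections.put(course.getKey(), ids);
        }
        return sections;
    }

    // Collects, for each course, the teachers qualified to teach it.
    static Map<String, List<String>> buildQualifiedTeachers(List<String> courses,
                                                            List<Teacher> teachers) {
        Map<String, List<String>> qualified = new LinkedHashMap<>();
        for (String course : courses) {
            List<String> names = new ArrayList<>();
            for (Teacher t : teachers) {
                if (t.isQualifiedFor(course)) {
                    names.add(t.name);
                }
            }
            qualified.put(course, names);
        }
        return qualified;
    }
}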
