
Download: Evolutionary Optimization Algorithms: Biologically-Inspired and Population-Based Approaches to Computer Intelligence

Book Details

Edition:
Authors:
Series:
ISBN: 9781118659502, 1118659503
Publisher: Wiley Blackwell
Year: 2013
Pages: [776]
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 34 MB

Price (Toman): 30,000




If you need the file for Evolutionary Optimization Algorithms: Biologically-Inspired and Population-Based Approaches to Computer Intelligence converted to PDF, EPUB, AZW3, MOBI, or DJVU, you can notify support and they will convert the file for you.

Note that this book is the original-language (English) edition, not a Persian translation. The International Library website offers original-language books only and does not provide books translated into or written in Persian.



Book Description

A clear and lucid bottom-up approach to the basic principles of evolutionary algorithms.

Evolutionary algorithms (EAs) are a type of artificial intelligence. EAs are motivated by optimization processes that we observe in nature, such as natural selection, species migration, bird swarms, human culture, and ant colonies. This book discusses the theory, history, mathematics, and programming of evolutionary optimization algorithms. Featured algorithms include genetic algorithms, genetic programming, ant colony optimization, particle swarm optimization, differential evolution, biogeography-based optimization, and many others.

Evolutionary Optimization Algorithms:
- Provides a straightforward, bottom-up approach that assists the reader in obtaining a clear, but theoretically rigorous, understanding of evolutionary algorithms, with an emphasis on implementation
- Gives a careful treatment of recently developed EAs, including opposition-based learning, artificial fish swarms, bacterial foraging, and many others, and discusses their similarities and differences from more well-established EAs
- Includes chapter-end problems plus a solutions manual available online for instructors
- Offers simple examples that provide the reader with an intuitive understanding of the theory
- Features source code for the examples available on the author's website
- Provides advanced mathematical techniques for analyzing EAs, including Markov modeling and dynamic system modeling

Evolutionary Optimization Algorithms: Biologically Inspired and Population-Based Approaches to Computer Intelligence is an ideal text for advanced undergraduate students, graduate students, and professionals involved in engineering and computer science.
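As a taste of the kind of algorithm the book covers, here is a minimal, hypothetical Python sketch of the (1+1) evolution strategy (Chapter 6) with a simple version of the 1/5 success rule (Section 6.2), minimizing the sphere benchmark from Appendix C.1.1. All function names and parameter values below are this sketch's own choices, not code from the book.

```python
import random

def sphere(x):
    # Sphere benchmark (Appendix C.1.1): f(x) = sum of squares, minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def one_plus_one_es(f, dim=5, iters=2000, seed=1):
    # Minimal (1+1) evolution strategy: one parent, one Gaussian-mutated
    # child per iteration; the child replaces the parent only if it is
    # at least as fit. Step size sigma adapts via the 1/5 success rule.
    rng = random.Random(seed)
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    cost = f(parent)
    sigma, successes = 1.0, 0
    for t in range(1, iters + 1):
        child = [xi + rng.gauss(0, sigma) for xi in parent]
        child_cost = f(child)
        if child_cost <= cost:          # greedy (1+1) selection
            parent, cost = child, child_cost
            successes += 1
        if t % 50 == 0:
            # 1/5 rule: widen the search if more than 1/5 of recent
            # mutations succeeded, otherwise narrow it.
            sigma *= 1.5 if successes > 10 else 0.5
            successes = 0
    return parent, cost

best, best_cost = one_plus_one_es(sphere)
print(best_cost)  # very close to 0 after 2000 mutations
```

The greedy acceptance step is what makes this a "plus" strategy: the parent can never get worse, which is the elitism idea the book develops further in Section 8.4.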



Table of Contents

Cover
Title Page
Copyright Page
SHORT TABLE OF CONTENTS
DETAILED TABLE OF CONTENTS
Acknowledgments
Acronyms
List of Algorithms
PART I INTRODUCTION TO EVOLUTIONARY OPTIMIZATION
	1 Introduction
		1.1 Terminology
		1.2 Why Another Book on Evolutionary Algorithms?
		1.3 Prerequisites
		1.4 Homework Problems
		1.5 Notation
		1.6 Outline of the Book
		1.7 A Course Based on This Book
	2 Optimization
		2.1 Unconstrained Optimization
		2.2 Constrained Optimization
		2.3 Multi-Objective Optimization
		2.4 Multimodal Optimization
		2.5 Combinatorial Optimization
		2.6 Hill Climbing
			2.6.1 Biased Optimization Algorithms
			2.6.2 The Importance of Monte Carlo Simulations
		2.7 Intelligence
			2.7.1 Adaptation
			2.7.2 Randomness
			2.7.3 Communication
			2.7.4 Feedback
			2.7.5 Exploration and Exploitation
		2.8 Conclusion
		Problems
PART II CLASSIC EVOLUTIONARY ALGORITHMS
	3 Genetic Algorithms
		3.1 The History of Genetics
			3.1.1 Charles Darwin
			3.1.2 Gregor Mendel
		3.2 The Science of Genetics
		3.3 The History of Genetic Algorithms
		3.4 A Simple Binary Genetic Algorithm
			3.4.1 A Genetic Algorithm for Robot Design
			3.4.2 Selection and Crossover
			3.4.3 Mutation
			3.4.4 GA Summary
			3.4.5 GA Tuning Parameters and Examples
		3.5 A Simple Continuous Genetic Algorithm
		3.6 Conclusion
		Problems
	4 Mathematical Models of Genetic Algorithms
		4.1 Schema Theory
		4.2 Markov Chains
		4.3 Markov Model Notation for Evolutionary Algorithms
		4.4 Markov Models of Genetic Algorithms
			4.4.1 Selection
			4.4.2 Mutation
			4.4.3 Crossover
		4.5 Dynamic System Models of Genetic Algorithms
			4.5.1 Selection
			4.5.2 Mutation
			4.5.3 Crossover
		4.6 Conclusion
		Problems
	5 Evolutionary Programming
		5.1 Continuous Evolutionary Programming
		5.2 Finite State Machine Optimization
		5.3 Discrete Evolutionary Programming
		5.4 The Prisoner's Dilemma
		5.5 The Artificial Ant Problem
		5.6 Conclusion
		Problems
	6 Evolution Strategies
		6.1 The (1+1) Evolution Strategy
		6.2 The 1/5 Rule: A Derivation
		6.3 The (μ+1) Evolution Strategy
		6.4 (μ + λ) and (μ, λ) Evolution Strategies
		6.5 Self-Adaptive Evolution Strategies
		6.6 Conclusion
		Problems
	7 Genetic Programming
		7.1 Lisp: The Language of Genetic Programming
		7.2 The Fundamentals of Genetic Programming
			7.2.1 Fitness Measure
			7.2.2 Termination Criteria
			7.2.3 Terminal Set
			7.2.4 Function Set
			7.2.5 Initialization
			7.2.6 Genetic Programming Parameters
		7.3 Genetic Programming for Minimum Time Control
		7.4 Genetic Programming Bloat
		7.5 Evolving Entities other than Computer Programs
		7.6 Mathematical Analysis of Genetic Programming
			7.6.1 Definitions and Notation
			7.6.2 Selection and Crossover
			7.6.3 Mutation and Final Results
		7.7 Conclusion
		Problems
	8 Evolutionary Algorithm Variations
		8.1 Initialization
		8.2 Convergence Criteria
		8.3 Problem Representation Using Gray Coding
		8.4 Elitism
		8.5 Steady-State and Generational Algorithms
		8.6 Population Diversity
			8.6.1 Duplicate Individuals
			8.6.2 Niche-Based and Species-Based Recombination
			8.6.3 Niching
		8.7 Selection Options
			8.7.1 Stochastic Universal Sampling
			8.7.2 Over-Selection
			8.7.3 Sigma Scaling
			8.7.4 Rank-Based Selection
			8.7.5 Linear Ranking
			8.7.6 Tournament Selection
			8.7.7 Stud Evolutionary Algorithms
		8.8 Recombination
			8.8.1 Single-Point Crossover (Binary or Continuous EAs)
			8.8.2 Multiple-Point Crossover (Binary or Continuous EAs)
			8.8.3 Segmented Crossover (Binary or Continuous EAs)
			8.8.4 Uniform Crossover (Binary or Continuous EAs)
			8.8.5 Multi-Parent Crossover (Binary or Continuous EAs)
			8.8.6 Global Uniform Crossover (Binary or Continuous EAs)
			8.8.7 Shuffle Crossover (Binary or Continuous EAs)
			8.8.8 Flat Crossover and Arithmetic Crossover (Continuous EAs)
			8.8.9 Blended Crossover (Continuous EAs)
			8.8.10 Linear Crossover (Continuous EAs)
			8.8.11 Simulated Binary Crossover (Continuous EAs)
			8.8.12 Summary
		8.9 Mutation
			8.9.1 Uniform Mutation Centered at xi(k)
			8.9.2 Uniform Mutation Centered at the Middle of the Search Domain
			8.9.3 Gaussian Mutation Centered at xi(k)
			8.9.4 Gaussian Mutation Centered at the Middle of the Search Domain
		8.10 Conclusion
		Problems
PART III MORE RECENT EVOLUTIONARY ALGORITHMS
	9 Simulated Annealing
		9.1 Annealing in Nature
		9.2 A Simple Simulated Annealing Algorithm
		9.3 Cooling Schedules
			9.3.1 Linear Cooling
			9.3.2 Exponential Cooling
			9.3.3 Inverse Cooling
			9.3.4 Logarithmic Cooling
			9.3.5 Inverse Linear Cooling
			9.3.6 Dimension-Dependent Cooling
		9.4 Implementation Issues
			9.4.1 Candidate Solution Generation
			9.4.2 Reinitialization
			9.4.3 Keeping Track of the Best Candidate Solution
		9.5 Conclusion
		Problems
	10 Ant Colony Optimization
		10.1 Pheromone Models
		10.2 Ant System
		10.3 Continuous Optimization
		10.4 Other Ant Systems
			10.4.1 Max-Min Ant System
			10.4.2 Ant Colony System
			10.4.3 Even More Ant Systems
		10.5 Theoretical Results
		10.6 Conclusion
		Problems
	11 Particle Swarm Optimization
		11.1 A Basic Particle Swarm Optimization Algorithm
		11.2 Velocity Limiting
		11.3 Inertia Weighting and Constriction Coefficients
			11.3.1 Inertia Weighting
			11.3.2 The Constriction Coefficient
			11.3.3 PSO Stability
		11.4 Global Velocity Updates
		11.5 The Fully Informed Particle Swarm
		11.6 Learning from Mistakes
		11.7 Conclusion
		Problems
	12 Differential Evolution
		12.1 A Basic Differential Evolution Algorithm
		12.2 Differential Evolution Variations
			12.2.1 Trial Vectors
			12.2.2 Mutant Vectors
			12.2.3 Scale Factor Adjustment
		12.3 Discrete Optimization
			12.3.1 Mixed-Integer Differential Evolution
			12.3.2 Discrete Differential Evolution
		12.4 Differential Evolution and Genetic Algorithms
		12.5 Conclusion
		Problems
	13 Estimation of Distribution Algorithms
		13.1 Estimation of Distribution Algorithms: Basic Concepts
			13.1.1 A Simple Estimation of Distribution Algorithm
			13.1.2 Computations of Statistics
		13.2 First-Order Estimation of Distribution Algorithms
			13.2.1 The Univariate Marginal Distribution Algorithm (UMDA)
			13.2.2 The Compact Genetic Algorithm (cGA)
			13.2.3 Population Based Incremental Learning (PBIL)
		13.3 Second-Order Estimation of Distribution Algorithms
			13.3.1 Mutual Information Maximization for Input Clustering (MIMIC)
			13.3.2 Combining Optimizers with Mutual Information Trees (COMIT)
			13.3.3 The Bivariate Marginal Distribution Algorithm (BMDA)
		13.4 Multivariate Estimation of Distribution Algorithms
			13.4.1 The Extended Compact Genetic Algorithm (ECGA)
			13.4.2 Other Multivariate Estimation of Distribution Algorithms
		13.5 Continuous Estimation of Distribution Algorithms
			13.5.1 The Continuous Univariate Marginal Distribution Algorithm
			13.5.2 Continuous Population Based Incremental Learning
		13.6 Conclusion
		Problems
	14 Biogeography-Based Optimization
		14.1 Biogeography
		14.2 Biogeography is an Optimization Process
		14.3 Biogeography-Based Optimization
		14.4 BBO Extensions
			14.4.1 Migration Curves
			14.4.2 Blended Migration
			14.4.3 Other Approaches to BBO
			14.4.4 BBO and Genetic Algorithms
		14.5 Conclusion
		Problems
	15 Cultural Algorithms
		15.1 Cooperation and Competition
		15.2 Belief Spaces in Cultural Algorithms
		15.3 Cultural Evolutionary Programming
		15.4 The Adaptive Culture Model
		15.5 Conclusion
		Problems
	16 Opposition-Based Learning
		16.1 Opposition Definitions and Concepts
			16.1.1 Reflected Opposites and Modulo Opposites
			16.1.2 Partial Opposites
			16.1.3 Type 1 Opposites and Type 2 Opposites
			16.1.4 Quasi Opposites and Super Opposites
		16.2 Opposition-Based Evolutionary Algorithms
		16.3 Opposition Probabilities
		16.4 Jumping Ratio
		16.5 Oppositional Combinatorial Optimization
		16.6 Dual Learning
		16.7 Conclusion
		Problems
	17 Other Evolutionary Algorithms
		17.1 Tabu Search
		17.2 Artificial Fish Swarm Algorithm
			17.2.1 Random Behavior
			17.2.2 Chasing Behavior
			17.2.3 Swarming Behavior
			17.2.4 Searching Behavior
			17.2.5 Leaping Behavior
			17.2.6 A Summary of the Artificial Fish Swarm Algorithm
		17.3 Group Search Optimizer
		17.4 Shuffled Frog Leaping Algorithm
		17.5 The Firefly Algorithm
		17.6 Bacterial Foraging Optimization
		17.7 Artificial Bee Colony Algorithm
		17.8 Gravitational Search Algorithm
		17.9 Harmony Search
		17.10 Teaching-Learning-Based Optimization
		17.11 Conclusion
		Problems
PART IV SPECIAL TYPES OF OPTIMIZATION PROBLEMS
	18 Combinatorial Optimization
		18.1 The Traveling Salesman Problem
		18.2 TSP Initialization
			18.2.1 Nearest-Neighbor Initialization
			18.2.2 Shortest-Edge Initialization
			18.2.3 Insertion Initialization
			18.2.4 Stochastic Initialization
		18.3 TSP Representations and Crossover
			18.3.1 Path Representation
			18.3.2 Adjacency Representation
			18.3.3 Ordinal Representation
			18.3.4 Matrix Representation
		18.4 TSP Mutation
			18.4.1 Inversion
			18.4.2 Insertion
			18.4.3 Displacement
			18.4.4 Reciprocal Exchange
		18.5 An Evolutionary Algorithm for the Traveling Salesman Problem
		18.6 The Graph Coloring Problem
		18.7 Conclusion
		Problems
	19 Constrained Optimization
		19.1 Penalty Function Approaches
			19.1.1 Interior Point Methods
			19.1.2 Exterior Methods
		19.2 Popular Constraint-Handling Methods
			19.2.1 Static Penalty Methods
			19.2.2 Superiority of Feasible Points
			19.2.3 The Eclectic Evolutionary Algorithm
			19.2.4 Co-evolutionary Penalties
			19.2.5 Dynamic Penalty Methods
			19.2.6 Adaptive Penalty Methods
			19.2.7 Segregated Genetic Algorithm
			19.2.8 Self-Adaptive Fitness Formulation
			19.2.9 Self-Adaptive Penalty Function
			19.2.10 Adaptive Segregational Constraint Handling
			19.2.11 Behavioral Memory
			19.2.12 Stochastic Ranking
			19.2.13 The Niched-Penalty Approach
		19.3 Special Representations and Special Operators
			19.3.1 Special Representations
			19.3.2 Special Operators
			19.3.3 Genocop
			19.3.4 Genocop II
			19.3.5 Genocop III
		19.4 Other Approaches to Constrained Optimization
			19.4.1 Cultural Algorithms
			19.4.2 Multi-Objective Optimization
		19.5 Ranking Candidate Solutions
			19.5.1 Maximum Constraint Violation Ranking
			19.5.2 Constraint Order Ranking
			19.5.3 ε-Level Comparisons
		19.6 A Comparison Between Constraint-Handling Methods
		19.7 Conclusion
		Problems
	20 Multi-Objective Optimization
		20.1 Pareto Optimality
		20.2 The Goals of Multi-Objective Optimization
			20.2.1 Hypervolume
			20.2.2 Relative Coverage
		20.3 Non-Pareto-Based Evolutionary Algorithms
			20.3.1 Aggregation Methods
			20.3.2 The Vector Evaluated Genetic Algorithm (VEGA)
			20.3.3 Lexicographic Ordering
			20.3.4 The ε-Constraint Method
			20.3.5 Gender-Based Approaches
		20.4 Pareto-Based Evolutionary Algorithms
			20.4.1 Evolutionary Multi-Objective Optimizers
			20.4.2 The ε-Based Multi-Objective Evolutionary Algorithm (ε-MOEA)
			20.4.3 The Nondominated Sorting Genetic Algorithm (NSGA)
			20.4.4 The Multi-Objective Genetic Algorithm (MOGA)
			20.4.5 The Niched Pareto Genetic Algorithm (NPGA)
			20.4.6 The Strength Pareto Evolutionary Algorithm (SPEA)
			20.4.7 The Pareto Archived Evolution Strategy (PAES)
		20.5 Multi-Objective Biogeography-Based Optimization
			20.5.1 Vector Evaluated BBO
			20.5.2 Nondominated Sorting BBO
			20.5.3 Niched Pareto BBO
			20.5.4 Strength Pareto BBO
			20.5.5 Multi-Objective BBO Simulations
		20.6 Conclusion
		Problems
	21 Expensive, Noisy, and Dynamic Fitness Functions
		21.1 Expensive Fitness Functions
			21.1.1 Fitness Function Approximation
			21.1.2 Approximating Transformed Functions
			21.1.3 How to Use Fitness Approximations in Evolutionary Algorithms
			21.1.4 Multiple Models
			21.1.5 Overfitting
			21.1.6 Evaluating Approximation Methods
		21.2 Dynamic Fitness Functions
			21.2.1 The Predictive Evolutionary Algorithm
			21.2.2 Immigrant Schemes
			21.2.3 Memory-Based Approaches
			21.2.4 Evaluating Dynamic Optimization Performance
		21.3 Noisy Fitness Functions
			21.3.1 Resampling
			21.3.2 Fitness Estimation
			21.3.3 The Kalman Evolutionary Algorithm
		21.4 Conclusion
		Problems
PART V APPENDICES
	Appendix A: Some Practical Advice
		A.1 Check for Bugs
		A.2 Evolutionary Algorithms are Stochastic
		A.3 Small Changes can have Big Effects
		A.4 Big changes can have Small Effects
		A.5 Populations Have Lots of Information
		A.6 Encourage Diversity
		A.7 Use Problem-Specific Information
		A.8 Save your Results Often
		A.9 Understand Statistical Significance
		A.10 Write Well
		A.11 Emphasize Theory
		A.12 Emphasize Practice
	Appendix B: The No Free Lunch Theorem and Performance Testing
		B.1 The No Free Lunch Theorem
		B.2 Performance Testing
			B.2.1 Overstatements Based on Simulation Results
			B.2.2 How to Report (and How Not to Report) Simulation Results
			B.2.3 Random Numbers
			B.2.4 T-Tests
			B.2.5 F-Tests
		B.3 Conclusion
	Appendix C: Benchmark Optimization Functions
		C.1 Unconstrained Benchmarks
			C.1.1 The Sphere Function
			C.1.2 The Ackley Function
			C.1.3 The Ackley Test Function
			C.1.4 The Rosenbrock Function
			C.1.5 The Fletcher Function
			C.1.6 The Griewank Function
			C.1.7 The Penalty #1 Function
			C.1.8 The Penalty #2 Function
			C.1.9 The Quartic Function
			C.1.10 The Tenth Power Function
			C.1.11 The Rastrigin Function
			C.1.12 The Schwefel Double Sum Function
			C.1.13 The Schwefel Max Function
			C.1.14 The Schwefel Absolute Function
			C.1.15 The Schwefel Sine Function
			C.1.16 The Step Function
			C.1.17 The Absolute Function
			C.1.18 Shekel's Foxhole Function
			C.1.19 The Michalewicz Function
			C.1.20 The Sine Envelope Function
			C.1.21 The Eggholder Function
			C.1.22 The Weierstrass Function
		C.2 Constrained Benchmarks
			C.2.1 The C01 Function
			C.2.2 The C02 Function
			C.2.3 The C03 Function
			C.2.4 The C04 Function
			C.2.5 The C05 Function
			C.2.6 The C06 Function
			C.2.7 The C07 Function
			C.2.8 The C08 Function
			C.2.9 The C09 Function
			C.2.10 The C10 Function
			C.2.11 The C11 Function
			C.2.12 The C12 Function
			C.2.13 The C13 Function
			C.2.14 The C14 Function
			C.2.15 The C15 Function
			C.2.16 The C16 Function
			C.2.17 The C17 Function
			C.2.18 The C18 Function
			C.2.19 Summary of Constrained Benchmarks
		C.3 Multi-Objective Benchmarks
			C.3.1 Unconstrained Multi-Objective Optimization Problem 1
			C.3.2 Unconstrained Multi-Objective Optimization Problem 2
			C.3.3 Unconstrained Multi-Objective Optimization Problem 3
			C.3.4 Unconstrained Multi-Objective Optimization Problem 4
			C.3.5 Unconstrained Multi-Objective Optimization Problem 5
			C.3.6 Unconstrained Multi-Objective Optimization Problem 6
			C.3.7 Unconstrained Multi-Objective Optimization Problem 7
			C.3.8 Unconstrained Multi-Objective Optimization Problem 8
			C.3.9 Unconstrained Multi-Objective Optimization Problem 9
			C.3.10 Unconstrained Multi-Objective Optimization Problem 10
		C.4 Dynamic Benchmarks
			C.4.1 The Complete Dynamic Benchmark Description
			C.4.2 A Simplified Dynamic Benchmark Description
		C.5 Noisy Benchmarks
		C.6 Traveling Salesman Problems
		C.7 Unbiasing the Search Space
			C.7.1 Offsets
			C.7.2 Rotation Matrices
References
Topic Index



