Learning in abstract memory schemes for dynamic optimization.
dc.contributor.author | Richter, Hendrik | en |
dc.contributor.author | Yang, Shengxiang | en |
dc.date.accessioned | 2013-06-11T16:01:39Z | |
dc.date.available | 2013-06-11T16:01:39Z | |
dc.date.issued | 2008 | |
dc.description.abstract | We investigate an abstraction based memory scheme for evolutionary algorithms in dynamic environments. In this scheme, the abstraction of good solutions (i.e., their approximate location in the search space) is stored in the memory instead of good solutions themselves and is employed to improve future problem solving. In particular, this paper shows how learning takes place in the abstract memory scheme and how the performance in problem solving changes over time for different kinds of dynamics in the fitness landscape. The experiments show that the abstract memory enables learning processes and efficiently improves the performance of evolutionary algorithms in dynamic environments. | en |
dc.identifier.citation | Richter, H. and Yang, S. (2008) Learning in abstract memory schemes for dynamic optimization. In: Proceedings of the 4th International Conference on Natural Computation, Jinan, China, October 2008. Vol. 1. New York: IEEE, pp. 86-91. | en |
dc.identifier.doi | https://doi.org/10.1109/ICNC.2008.110 | |
dc.identifier.isbn | 978-0-7695-3304-9 | |
dc.identifier.uri | http://hdl.handle.net/2086/8730 | |
dc.language.iso | en | en |
dc.peerreviewed | Yes | en |
dc.publisher | IEEE | en |
dc.researchgroup | Centre for Computational Intelligence | en |
dc.researchinstitute | Institute of Artificial Intelligence (IAI) | en |
dc.subject | Abstract memory | en |
dc.subject | Evolutionary algorithm | en |
dc.subject | Learning | en |
dc.title | Learning in abstract memory schemes for dynamic optimization. | en |
dc.type | Article | en |
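The abstract above describes storing an abstraction of good solutions (their approximate location in the search space) rather than the solutions themselves, and using that abstraction to seed the evolutionary algorithm after the environment changes. The following is a minimal sketch of how such an abstraction-based memory could be organized, assuming a grid partition of the search space with per-cell counters and proportional sampling for retrieval; the class name `AbstractMemory`, the grid resolution, and the method names `store`/`retrieve` are illustrative choices for this sketch, not identifiers taken from the paper.

```python
import numpy as np

class AbstractMemory:
    """Sketch of an abstraction-based memory for a real-valued EA.

    Instead of storing good solutions directly, the search space is
    partitioned into a coarse grid, and a counter per cell records how
    often a good solution fell into that cell (its approximate location).
    Retrieval samples cells in proportion to their counters and draws
    new individuals uniformly inside the selected cells.
    """

    def __init__(self, lower, upper, cells_per_dim=10):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)
        self.cells = cells_per_dim
        self.dim = len(self.lower)
        # Counter matrix over the grid: the stored "abstraction".
        self.counts = np.zeros((cells_per_dim,) * self.dim)

    def _cell_index(self, x):
        # Map a solution to the index of the grid cell containing it.
        rel = (np.asarray(x, dtype=float) - self.lower) / (self.upper - self.lower)
        idx = np.clip((rel * self.cells).astype(int), 0, self.cells - 1)
        return tuple(idx)

    def store(self, good_solution):
        # Storage step: remember only the region, not the point itself.
        self.counts[self._cell_index(good_solution)] += 1

    def retrieve(self, n_individuals, rng=None):
        # Retrieval step: pick cells proportionally to their counters,
        # then sample uniformly within each chosen cell.
        rng = rng if rng is not None else np.random.default_rng()
        if self.counts.sum() == 0:
            return rng.uniform(self.lower, self.upper,
                               size=(n_individuals, self.dim))
        probs = (self.counts / self.counts.sum()).ravel()
        chosen = rng.choice(self.counts.size, size=n_individuals, p=probs)
        cell_width = (self.upper - self.lower) / self.cells
        out = np.empty((n_individuals, self.dim))
        for i, flat_idx in enumerate(chosen):
            idx = np.array(np.unravel_index(flat_idx, self.counts.shape))
            low = self.lower + idx * cell_width
            out[i] = rng.uniform(low, low + cell_width)
        return out
```

A typical usage pattern under these assumptions would be to call `store` with the best individual of each generation and, when a change in the fitness landscape is detected, replace part of the population with individuals drawn from `retrieve`, so that accumulated knowledge about promising regions is reused.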
Files
License bundle (1 item)
- license.txt (3.18 KB): Item-specific license agreed upon submission