IWLCS 2010 - Discussion session on LCS / XCS(F)

I just got an email from Martin Butz about a discussion session being planned for IWLCS 2010, along with his request to pass it along:

Hope all is well and that you are going to attend GECCO this year. Regardless of whether you attend or not: Jaume asked me to lead a discussion session on “LCS representations, operators, and scalability – what is next?” (or similar) during IWLCS. Basically everything besides data mining, because there will be another session on that topic.

So, I am sure you all have some issues in mind that you think should be tackled, addressed, or discussed at the workshop and in the near future. Thus, I would be very happy to receive a few suggestions from your side (anything is welcome); I will then compile the points raised into a few slides to try and get the discussion going at the workshop. Thank you for any feedback you can provide. Looking forward to seeing you soon! Martin

P.S.: Please feel free to forward this message, or tell me if you think this email should also be sent to other people.

—-
PD Dr. Martin V. Butz <butz@psychologie.uni-wuerzburg.de>
Department of Psychology III (Cognitive Psychology)
Roentgenring 11
97070 Wuerzburg, Germany
http://www.coboslab.psychologie.uni-wuerzburg.de/people/martin_v_butz/
http://www.coboslab.psychologie.uni-wuerzburg.de
Phone: +49 (0)931 31 82808
Fax: +49 (0)931 31 82815
...

Jun 21, 2010 · 2 min · 217 words · Xavier Llorà

LCS and Software Development

“On the Road to Competence” is a slide deck by Jurgen Appelo that draws interesting analogies between learning classifier systems and software development. Definitely worth a look.

Jun 18, 2010 · 1 min · 29 words · Xavier Llorà

GAssist and GALE Now Available in Python

Ryan Urbanowicz has released Python versions of GAssist and GALE!!! Yup, so excited to see a new incarnation of GALE doing the rounds. I cannot wait to get my hands on it. Ryan has also done an excellent job porting UCS, XCS, and MCS to Python and making those implementations available via “LCS & GBML central” for people to use. I think Ryan’s efforts deserve recognition. His code is giving others an easier entry into the LCS and GBML fields. More information about Ryan’s implementations can be found below ...

Jun 11, 2010 · 1 min · 107 words · Xavier Llorà

LCS & GBML Central Gets a New Home

Today I finished migrating the LCS & GBML Central site from its original URL (http://lcs-gbml.ncsa.uiuc.edu) to a more permanent and stable home located at http://gbml.org. The original site is currently redirecting traffic to the new site, and it will keep doing so for a while to help people transition and update bookmarks and feed readers. I have introduced a few changes to the functionality of the original site. The functional changes can mostly be summarized as (1) dropping the forums section and (2) closing comments on posts and pages. Both features, rarely used in their previous form, have been replaced by a simpler public embedded Wave reachable at http://gbml.org/wave. The goal: to provide people in the LCS & GBML community a simpler way to discuss, share, and hang out. As for the feeds being aggregated, I have revised the list and added the newly available table-of-contents feeds from ...

Jun 4, 2010 · 1 min · 208 words · Xavier Llorà

Large Scale Data Mining using Genetics-Based Machine Learning

Below you may find the slides of the GECCO 2009 tutorial that Jaume Bacardit and I put together. Hope you enjoy them.

Slides

Abstract

We are living in the petabyte era. We have larger and larger data to analyze, process, and transform into useful answers for the domain experts. Robust data mining tools, able to cope with petascale volumes and/or high dimensionality while producing human-understandable solutions, are key in several domain areas. Genetics-based machine learning (GBML) techniques are perfect candidates for this task, among others, due to the recent advances in representations, learning paradigms, and theoretical modeling. If evolutionary learning techniques aspire to be a relevant player in this context, they need the capacity to process these vast amounts of data, and they need to process this data within reasonable time. Moreover, massive computation cycles are getting cheaper every day, giving researchers access to unprecedented degrees of parallelization. Several topics are interlaced in these two requirements: (1) having the proper learning paradigms and knowledge representations, (2) understanding them and knowing when they are suitable for the problem at hand, (3) using efficiency enhancement techniques, and (4) transforming and visualizing the produced solutions to give back as much insight as possible to the domain experts.

This tutorial will try to address these challenges, following a roadmap that starts with the questions of what “large” means, and why large is a challenge for GBML methods. Afterwards, we will discuss different facets in which we can overcome this challenge: efficiency enhancement techniques, representations able to cope with large dimensionality spaces, and scalability of learning paradigms.
We will also review a topic interlaced with all of them: how we can model the scalability of the components of our GBML systems to better engineer them and get the best performance out of them on large datasets. The roadmap continues with examples of real applications of GBML systems and finishes with an analysis of further directions. ...

Jul 15, 2009 · 2 min · 326 words · Xavier Llorà