Wednesday, April 22, 2009

Meeting Minutes - 22 April 2009

Today I (Meditya) presented Herbert A. Simon's paper "Machine as Mind", which appeared in "Android Epistemology", edited by C. Glymour, K. Ford, and P. Hayes.

Defining the primitives of mind as symbols, complex structures of symbols, and processes that operate on symbols (Newell & Simon, 1976), the central thesis of his paper states: "Conventional computers can be, and have been programmed to represent symbol structures and carry out processes on those structures in a manner that parallels, step by step, the way the human brain does it."

In his paper, Simon argued that every aspect of the human mind can be translated into definable representations, and that these definitions can in turn be embedded in a thinking machine. To defend his thesis, he explained several important points of discussion, such as:

- The Concept of Decomposable System
- Two Approaches to Artificial Intelligence (humanoid and non-humanoid)
- The Concept of Mind from a Psychological Perspective
- Selective Heuristic Search
- Recognition: The Indexed Memory
- Seriality: The Limits of Attention (short-term memory)
- The Architecture of Expert Systems

In addition, he responded to the common objections that certain aspects of thinking cannot be copied by a machine, defending his thesis on points such as:

- Semantics
- Intention
- "Ill Structured" Tasks
- Language Processing
- Intuition
- Insight
- Creativity

At the end of the meeting we had a nice discussion about the state of the art at the time of writing (1995) compared with the situation today, and about the extent to which Simon's vision applies in the present and the future.

Wednesday, April 15, 2009

Meeting Minutes - 15 April 2009

Today I presented the paper “Computational Intelligence in Economic Games and Policy Design” by Herbert Dawid, Han La Poutré, and Xin Yao (IEEE Computational Intelligence Magazine, 3(4), 22–26, 2008). The paper provides an overview of applications of computational intelligence techniques in economics. Both strong and weak points of the use of computational intelligence techniques are discussed. According to the authors, the two most important weak points are the issue of empirical validation and the issue of robustness.

I also discussed my own view on the relation between mainstream economics on the one hand and agent-based computational economics on the other hand. Due to the increasing popularity of experimental economics and evolutionary game theory, mainstream economics focuses more and more on bounded rationality and dynamic (rather than static) analysis. From this perspective, the difference between mainstream economics and agent-based computational economics is smaller than is sometimes thought. I argued that the main difference is between following a mathematical approach (as mainstream economics does) and following a simulation approach (as agent-based computational economics does). There is much to be gained by combining these two approaches. Today’s meeting ended with a discussion of the difference between mathematical analysis and computer simulation, and how this difference relates to the difference between deduction and induction in science. We also discussed the issue of implicit assumptions that are hidden in technical details in agent-based computational economics research.
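To make the contrast between the two approaches concrete, here is a minimal sketch (my own illustration, not taken from the paper) in Python: the Nash equilibrium of a Cournot duopoly obtained by mathematical deduction, next to boundedly rational agents that reach the same outcome inductively, by repeated best responses in a simulation. The demand and cost parameters are arbitrary.

```python
# Illustrative sketch (not from the paper): comparing a mathematical result
# with an agent-based simulation in a simple Cournot duopoly.
# Inverse demand: p = a - b*(q1 + q2); constant marginal cost c.

a, b, c = 100.0, 1.0, 10.0

# Deductive approach: closed-form Nash equilibrium quantity per firm.
q_star = (a - c) / (3 * b)

# Inductive approach: boundedly rational firms repeatedly best-respond
# to the quantity their rival chose in the previous round.
def best_response(q_other):
    return max((a - c - b * q_other) / (2 * b), 0.0)

q1, q2 = 5.0, 40.0  # arbitrary starting quantities
for _ in range(50):
    q1, q2 = best_response(q2), best_response(q1)

print(f"analytic equilibrium: {q_star:.2f}")
print(f"simulated quantities: {q1:.2f}, {q2:.2f}")
```

In this toy case the simulation converges to the analytic result, but the simulation approach keeps working when the learning rule or the market structure becomes too complicated for a closed-form analysis, which is where the two approaches can usefully complement each other.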

Thursday, April 9, 2009

Meeting Minutes - 25 March 2009

In the presentation, Romke and Otto presented the outcomes of the master's seminar in computational economics. The research focused on improving the price prediction mechanism of the MinneTAC agent. The MinneTAC agent is an agent-based computer model developed by the University of Minnesota in cooperation with Erasmus University.

The MinneTAC agent competes with other agent-based computer models in the Trading Agent Competition for Supply Chain Management (TAC SCM). The TAC SCM game was designed to find the best design for an agent that is capable of dealing with the problems of a dynamic supply chain.

In the MinneTAC agent, an ensemble consisting of multiple price predictors is used to predict future market prices. The function of the model selection mechanism is to determine the most accurate price based on the predictions of all the individual predictors that make up the ensemble. The advantage of using multiple predictors is the ability to capture more features in the data than a single predictor can. The disadvantage of using multiple predictors is that different features are captured, which leads to different predictions. A second disadvantage is that not every price predictor performs optimally for every time horizon and quantity of training data. To overcome these disadvantages, a dynamic weighting mechanism with adaptive weights has been developed for the MinneTAC agent. This weighting mechanism has to find the optimal weights for every price predictor for every time horizon. The weights are learned during the game, while the agent competes with its competitors for customer orders. When the agent starts, every price predictor has an equal weight; only as the game progresses does the agent converge to the optimal weights. This means that the price prediction mechanism does not work with optimal weights during the first phase of the game.
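The exact weighting scheme of MinneTAC is not described in these minutes; the sketch below is a hypothetical Python illustration of how such a dynamic weighting mechanism could work. The predictor interface and the error-proportional multiplicative weight update are my own assumptions, not the actual MinneTAC implementation.

```python
import math

class WeightedEnsemble:
    """Hypothetical dynamically weighted ensemble of price predictors.

    Not the actual MinneTAC code: the callable-predictor interface and the
    error-proportional multiplicative weight update are assumptions.
    """

    def __init__(self, predictors, initial_weights=None, eta=0.5):
        self.predictors = predictors
        n = len(predictors)
        # Equal weights at the start of the game, unless bootstrapped weights are supplied.
        self.weights = list(initial_weights) if initial_weights else [1.0 / n] * n
        self.eta = eta  # step size of the weight update

    def predict(self, features):
        # Ensemble prediction: weighted average of the individual predictions.
        preds = [p(features) for p in self.predictors]
        return sum(w * y for w, y in zip(self.weights, preds)), preds

    def update(self, preds, actual_price):
        # Once the realized market price is observed, penalize each predictor
        # in proportion to its absolute error and renormalize the weights.
        errors = [abs(y - actual_price) for y in preds]
        scale = max(errors) or 1.0
        raw = [w * math.exp(-self.eta * e / scale)
               for w, e in zip(self.weights, errors)]
        total = sum(raw)
        self.weights = [w / total for w in raw]
```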

In our seminar research, we determined the optimal weights for every price predictor during the game. These data are used to bootstrap the agent, increasing its performance in the first phase of the game.
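Continuing the hypothetical sketch above, bootstrapping would then amount to starting the ensemble from weights learned in earlier games instead of from equal weights. The predictor functions and weight values below are made-up placeholders for illustration only.

```python
# Toy predictors standing in for the real price predictors in the ensemble.
predictors = [lambda f: f["last_price"],              # naive persistence predictor
              lambda f: 0.9 * f["last_price"] + 5.0,  # toy linear predictor
              lambda f: f["moving_average"]]          # toy moving-average predictor

learned_weights = [0.5, 0.2, 0.3]  # e.g. estimated offline from earlier game logs

# Bootstrapped agent: starts from the learned weights rather than equal weights.
ensemble = WeightedEnsemble(predictors, initial_weights=learned_weights)

features = {"last_price": 120.0, "moving_average": 118.0}
prediction, individual = ensemble.predict(features)
ensemble.update(individual, actual_price=119.0)
print(prediction, ensemble.weights)
```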

R.J. Romke de Vries
O.B. ter Haar