daneel-ai is a rule-based AI written in Python. The bot framework is designed to be ruleset-agnostic, with components that should be implemented per ruleset and per desired behavior. More information is on the proposal page (it will be merged here in time).
At the start of a turn, the bot downloads the universe data from the server and stores it in a local cache. It then enters the initialization phase: based on the initialization rules defined in the mod files, the cache is read and constraints are added to the constraint store. Next comes the resolution phase. The bot uses the constraints to deduce new knowledge about the system via the resolution rules listed in the mod files and the rule files, and adds this new knowledge to the store. This is the bulk of the work. Afterwards, the finalization phase (in the mod files) searches for constraints that signify orders to be executed and creates the appropriate orders. As such, the resolution phase can be seen as a process that transforms the state of the game into actions to be performed, with the initialization phase providing the state and the finalization phase executing the actions.
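In outline, a single turn might look like the sketch below. This is only an illustration of the flow just described: the driver function is hypothetical, and rulesystem.store and rulesystem.run() are assumed names; only startTurn and endTurn are documented further down.

    def do_turn(cache, rulesystem, connection, mods):
        for mod in mods:
            mod.startTurn(cache, rulesystem.store)      # initialization: fill the store from the cache
        rulesystem.run()                                # resolution: fire rules until a fixpoint
        for mod in mods:
            mod.endTurn(cache, rulesystem, connection)  # finalization: turn constraints into orders

Note that in the real bot the first two phases overlap: rules can already fire while startTurn is still adding constraints (see the remarks on cacheentered below).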
If you want to try out the bot, clone the repository at git://git.thousandparsec.net/git/daneel-ai.git and read the README to get started.
See daneel-ai/tutorial.
A rule file controls the final behavior of the bot. The file consists of four parts: the list of mods, the list of constraints, the rules, and a section with Python helper code. All four parts are mandatory; skipping one might make the bot hang or crash. Two examples are currently in the directory: rules-rfts and rules-risk.
At any point, you can add a comment line by using a '#' as the first character on the line. You cannot use this technique to add a comment at the end of a line; only the first character of each line is checked.
A list of mod files to be read. These control the initialization and finalization phases, and might also predefine constraints and rules. Currently, there are the following mod files: basic, mod-rfts and mod-risk.
Each constraint that is used in the system is listed here, together with the types it uses. A constraint can have a '*' appended to signify that it is a long-term constraint. A long-term constraint will not get removed from the constraint store at the end of a turn; all other constraints are removed. Long-term constraints can thus be used to carry some of the bot's state over to the next turn.
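As an illustration, a constraints section might contain declarations like the ones below. The exact declaration syntax is defined by the rule-file parser and may differ; knownthreat is a made-up example, with the '*' marking it as long-term:

    planet(int)
    location(int,int,int,int)
    knownthreat(int)*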
The rules roughly follow the CHR syntax, as explained below. The best way to get a feeling for this is to study the examples provided, I think. Coding these rules is a bit different from everyday imperative programming. For tips and pitfalls, see daneel-ai/rules.
Just a section where you can add helper functions. You can also import Python modules. The whole section is executed as Python code, and the resulting global variables are inserted into the environment where guards and bodies execute. Python code can make use of the variables "rulesystem", "cache" and "connection" to access the rule system, the libtpclient-py cache and the libtpproto-py connection directly. These can be used to make probe orders, for example. Their use is discouraged, though.
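For example, the helper section could define a distance function for use in guards. This is a hypothetical sketch; dist is not a predefined helper:

    import math

    def dist(x1, y1, z1, x2, y2, z2):
        # usable in a guard, e.g. "dist(X1,Y1,Z1,X2,Y2,Z2) < 50000"
        return math.sqrt((x2 - x1)**2 + (y2 - y1)**2 + (z2 - z1)**2)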
A mod file is actually just a Python module in which some functions and variables can be defined that influence the different phases. Note that mod files are parsed with the Python parser, while the rule files use a simple, line-based homemade parser.
Three lists of strings that are interpreted in the same way as the corresponding sections of rule files. It should be noted that all code should be considered global, so it is possible that a rule file (or another mod file) overwrites a function in a mod file's functions list.
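A minimal sketch of what such a fragment of a mod file might look like; the list names used here (constraints, rules, functions) are assumptions based on the description above:

    constraints = ["planet(int)", "resources(int)"]
    rules = ["buildships @ resources(X) <=> X > 500 | resources(X-500); order_build_ship()"]
    functions = ["def half(x): return x // 2"]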
init(cache,rulesystem,connection)
A function that gets called just once, right before the first turn starts. At that point the cache has been filled with the server's data, and the rule system has the rules but no constraints yet. This can, for example, be used to initialize some long-term constraints or to set constants based on static universe data.
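A sketch of what an init might do, assuming addConstraint accepts a constraint written as a string (the method is mentioned below, but its exact signature is not documented here):

    def init(cache, rulesystem, connection):
        # hypothetical: record the size of the (static) universe once
        rulesystem.addConstraint("universesize(%d)" % len(cache.objects))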
startTurn(cache,store)
This function gets called at the start of each turn. The purpose is to read the cache and add its data to the constraint store in a way that is useful to rule file writers. An example is in the basic.py file, which reads all objects and adds their position, velocity, etc. to the constraint store.
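A sketch in the same spirit as basic.py; the attribute names on the cache and its objects, and the addConstraint method on the store, are assumptions here:

    def startTurn(cache, store):
        for oid, obj in cache.objects.items():
            x, y, z = obj.pos
            store.addConstraint("location(%d,%d,%d,%d)" % (oid, x, y, z))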
endTurn(cache,rulesystem,connection)
This function gets called once all rule processing is done. The idea here is to search for some form of constraints and change the game state based on them, probably by adding orders to the relevant objects or by posting messages.
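A sketch of the shape an endTurn might take. It assumes findConstraint returns the matching constraints, and issue_build_order is a made-up helper; real order creation goes through the cache and connection and is ruleset-specific:

    def endTurn(cache, rulesystem, connection):
        for c in rulesystem.findConstraint("order_build_ship"):
            issue_build_order(cache, connection, c)  # hypothetical helper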
TODO: document some useful methods here (findConstraint, addConstraint, others?)
The bot is based on libtpclient-py to handle the connection to the server and the cache of game objects. Furthermore, the client is designed to be modular: each of the three phases can be changed to another implementation, to allow ruleset-specific information to be considered or to change the strategies of the AI.
The initialization phase pretty much copies the object hierarchy from the cache into the constraint store. This means that, for example, constraints of the form planet(5) and location(5,0,100032,0) will be added. This depends on the startTurn function in the loaded mods. At the end of the initialization phase, the constraint "cacheentered" is entered into the store.
Now the AI has a bunch of facts about the world and wants to deduce a battle plan from them. We give it a set of rules to apply to the constraint store. The idea of these rules is based on CHR. An example of a rule would be "if the constraint resources(X) exists and X > 500, then remove that constraint and add the constraints resources(X-500) and order_build_ship()".
One thing to note here is that a rule fires as soon as it can. This means that the resolution and initialization phases overlap: if a resources(1200) constraint is entered in the startTurn function, the above rule will fire even if not all constraints have been entered yet. To prevent the rule from firing too early, add the cacheentered constraint to the head so the rule doesn't match until all information is entered.
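For example, the build rule from above could be written as a simpagation rule (in the syntax explained below) that keeps cacheentered and only removes resources(X); whether a zero-parameter constraint is written with empty parentheses depends on the parser:

buildships @ cacheentered() \ resources(X) <=> X > 500 | resources(X-500); order_build_ship()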
When no more rules can fire because no precondition matches the constraint store, the algorithm enters the finalization phase. The finalization code checks the store for predicates of a certain form (probably order_move and order_build etc.), picks those out and executes the required actions. This depends on the endTurn functions in the loaded mods.
The program keeps track of a multiset of facts, called constraints. These facts are represented by predicates, possibly with parameters; those familiar with Prolog will feel at home. This is useful for abstract reasoning and search strategies.
The exact syntax as it will be in the bot still needs some thinking out. For now, I use some mix of Python and Prolog in this document. In CHR syntax, a general rule looks like this:
name @ kept \ removed <=> guard | body
The name is an optional string, mostly to ease debugging and possibly useful for reflection, although that needs to be researched. The kept \ removed part is also named the head. Both of those parts are optional. If there is no kept part, we talk about a simplification rule; if there is no removed part, we talk about a propagation rule (for clarity, we write propagation rules as kept ==> body); if both are there, it is a simpagation rule; and if neither is there, it's just a fact to be added to the constraint store, so it's not really a rule. The head consists of a conjunction of constraints only, the guard of Python code, and the body can contain a mix of both.
As an example, the rule from above:
buildships @ resources(X) <=> X > 500 | resources(X-500); order_build_ship()
A way to enforce set semantics (no duplicates) for a predicate:
predicate \ predicate <=> pass
A canonical CHR example is the less-than-or-equal relationship (basically, any partial ordering). Note that = means logical unification and can work in both directions (thus this is not really applicable here; it's more of a Prolog example):
reflexivity @ leq(X,X) <=> pass
antisymmetry @ leq(X,Y) and leq(Y,X) <=> X = Y
transitivity @ leq(X,Y) and leq(Y,Z) ==> leq(X,Z)
The algorithm matches each head against the constraint store in turn. If a match is found, the guard is checked. If it evaluates to True in a boolean context, the rule fires: the constraints that match the removed part are removed from the store, and those in the body are added to the store. The rules are then tried again from the beginning. This is the naive way of doing the matching, but improvements are possible if needed.
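A runnable toy model of this naive fixpoint loop, heavily simplified compared to the actual rule system: constraints are plain tuples, and each rule removes a single matched constraint, with the guard and body given as Python callables:

    def run_rules(rules, store):
        fired = True
        while fired:
            fired = False
            for name, guard, body in rules:
                for fact in store:
                    if fact[0] == name and guard(*fact[1:]):
                        store.remove(fact)             # drop the matched constraint
                        store.extend(body(*fact[1:]))  # add the body's constraints
                        fired = True
                        break                          # restart from the first rule
                if fired:
                    break

    # the buildships rule from earlier, which fires twice on resources(1200):
    rules = [("resources", lambda x: x > 500,
              lambda x: [("resources", x - 500), ("order_build_ship",)])]
    store = [("resources", 1200)]
    run_rules(rules, store)
    print(store)  # [('order_build_ship',), ('resources', 200), ('order_build_ship',)]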
To extend the standard example, assume the rules are the three leq rules defined above, and let the constraint store hold the three facts leq(A,B), leq(B,C), leq(C,A). The transitivity rule will fire on leq(A,B) and leq(B,C) and add leq(A,C). Then the antisymmetry rule matches leq(A,C) and leq(C,A) and asserts that A = C. The reflexivity rule will remove leq(C,A) and leq(A,C), since with A = C both have the form leq(X,X). The antisymmetry rule then makes A = B from leq(A,B) and leq(B,A), and reflexivity removes the remaining constraints. We end with an empty constraint store and the knowledge that A = B = C.
If I have time and the need arises, I might implement parts of proposed extensions of CHR, like rule priorities, negation by absence, and aggregate support (which allows constructs like sum, min and max to appear in rules). These constructs can be implemented with the existing structure, but for clarity it doesn't hurt to have the extensions. There might also be a need to implement a few Prolog-like predicates that allow efficient work with non-ground variables; this will become clear when coding. If the program turns out to be too slow, the naive matching algorithm can be changed to the more advanced Rete algorithm, which builds a graph of which rules to try when constraints change, dramatically speeding up the matching. Other algorithms might also be possible (TREAT, LEAPS), but I haven't worked with these before, so I'll have to do some research. Changes in the data structures underlying the constraint store and the rules might also provide a speedup.
Since SoC 2008 is essentially done, these are a bit lower priority at the moment.
- Increase performance. Currently, a non-trivial bot easily takes minutes to do something that could be done in a few seconds in pure Python. I believe the underlying rule-system algorithms are not optimal; improving these might need some research and experimentation, though.
- Make not every section in a rule file mandatory, and loosen the parsing there.