I somehow managed to read an hour of the Nectar of Devotion throughout the day. Wow. Amazing book. It gets better every time I read it. Reading it makes me ecstatic, even if I don't understand what it is talking about. The other conference attendees must have been wondering why I was grinning ear-to-ear while reading some book.
Today's tutorial was on "Principles of AI Problem Solving". Three professors talked the group through various "classic" AI methods for solving problems.
All problems in classic AI can be reduced to the satisfiability problem SAT. Many common problems can be abstracted as moving a robot from an initial state to a goal state on a grid. Four variations are possible:
- Actions are predictable and we can see exactly what happens.
- Actions are predictable, but none of the moves are observable.
- Actions' effects are random/probabilistic, but we can see what happens.
- Actions' effects are probabilistic and we can only get partial information about the events on the grid. This is the hardest problem.
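The first variation (deterministic actions, full observability) is the easiest to picture. A minimal sketch of it as breadth-first search on a grid — my own toy illustration, not something shown in the tutorial:

```python
from collections import deque

def bfs_grid(start, goal, walls, width, height):
    """Find a shortest path on a grid: deterministic moves, fully observable."""
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the predecessor chain back from the goal to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= nx < width and 0 <= ny < height
                    and nxt not in walls and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # goal unreachable

# On a 3x3 grid with one wall, the robot routes around the obstacle.
path = bfs_grid((0, 0), (2, 2), walls={(1, 1)}, width=3, height=3)
```

The other three variations replace this simple state-space search with belief states or probability distributions over states, which is where the difficulty ramps up.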
The various techniques for solving these problems involve searching different types of graph models of the problem space. Graphs can be transformed in certain ways to improve the efficiency of the search. A simple transformation is, for example, to reorder the graph to start at the node with the smallest domain/branching factor.
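To make the smallest-domain-first idea concrete, here is a sketch of a backtracking solver for a toy constraint problem that always branches on the variable with the fewest remaining values. The problem instance and names are my own hypothetical example, not from the lecture:

```python
def solve_csp(domains, constraints, assignment=None):
    """Backtracking search that branches on the unassigned variable
    with the smallest domain (the reordering heuristic from the talk)."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(domains):
        return assignment
    # Pick the unassigned variable with the fewest candidate values.
    var = min((v for v in domains if v not in assignment),
              key=lambda v: len(domains[v]))
    for value in domains[var]:
        assignment[var] = value
        if all(check(assignment) for check in constraints):
            result = solve_csp(domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

def different(u, v):
    # Constraint: u and v must differ once both are assigned.
    return lambda a: u not in a or v not in a or a[u] != a[v]

# Example: coloring a triangle graph; 'a' and 'b' have only two colors,
# so the solver branches on them before the looser variable 'c'.
domains = {'a': [1, 2], 'b': [1, 2], 'c': [1, 2, 3]}
constraints = [different('a', 'b'), different('b', 'c'), different('a', 'c')]
solution = solve_csp(domains, constraints)
```

Choosing the tightest variable first prunes dead branches early, which is the same intuition as starting the graph search at the node with the smallest branching factor.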
One lecturer mentioned a technique called "hypergraph decomposition" that can be used to break a graph into weighted, equi-sized pieces. The AI problem is thereby divided up and (hopefully) becomes solvable in logarithmic time instead of the usual exponential time necessary to solve NP-complete problems.
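A back-of-the-envelope illustration of why decomposition pays off (my own rough arithmetic, assuming the pieces really can be searched independently): exhaustive search over n boolean variables costs about 2^n, while k equal pieces cost about k · 2^(n/k).

```python
def exhaustive_cost(n):
    # Brute-force search over n boolean variables: all 2^n assignments.
    return 2 ** n

def decomposed_cost(n, k):
    # k independent pieces of n // k variables each, searched separately.
    # (Idealized: ignores the cost of recombining partial solutions.)
    return k * 2 ** (n // k)

# For n = 30 variables, whole-problem search visits about a billion
# states, while three independent pieces visit only a few thousand.
whole = exhaustive_cost(30)
split = decomposed_cost(30, 3)
```

Of course the hard part, which the lecturer glossed over, is that real problems rarely split into fully independent pieces; the decomposition has to account for the constraints crossing the cut.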
I might be able to use this decomposition technique to break up my ontology by using a reasoning dependency structure. That would help a lot. Very interesting. I'll investigate further.