[freeciv-ai] (PR#6595) Optimize/reimplement CM
<URL: http://rt.freeciv.org/Ticket/Display.html?id=6595 >
> [bhudson - Tue Nov 11 17:46:48 2003]:
> I'm not confident the patch is as stable as alpha yet, but I'll play
> around a bit. Also, I want to check that it's actually faster. The
> cache3 hit rate is certainly up a lot just when loading the game, and I
> expect it to be dramatically improved when you actually play. It
> remains to be seen if that translates to less time / call.
So now I've submitted the patch (6912). And it's very slightly faster.
Things to make it faster:
- cache1/cache2 get cleared on every call because I don't know how to check
whether they are still valid. These cache the effects of buildings, wonders,
etc. on production, money, and science; the effect of trade routes and
corruption on trade; and the effect of luxuries, buildings, and whatever
else on happiness. It seems like there should be some way to detect when
any of these have changed, and only then invalidate the results that are
affected. Of note: the current code doesn't check these either and
just assumes they haven't changed. But the cache gets wiped when we
- once we have that, if all the caches are valid and we're optimizing
for the same parameters again, we can simply return the previous answer.
This happens when you buy a building, for instance.
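A minimal sketch of that shortcut, assuming a hypothetical parameter struct and a global validity flag standing in for the real CM state (names here are illustrative, not the actual freeciv API):

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical stand-ins for the real CM parameter and result types. */
struct params {
  int factor_food, factor_shield, factor_trade;
  int minimal_surplus_food;
};

struct result { int food, shield, trade; };

static bool have_cached = false;
static bool caches_valid = false;   /* cleared by invalidation hooks */
static struct params cached_params;
static struct result cached_result;

/* If nothing changed and we're asked the same question, reuse the
 * previous answer instead of re-running the optimizer. */
static bool try_cached_answer(const struct params *p, struct result *out)
{
  if (caches_valid && have_cached
      && memcmp(p, &cached_params, sizeof(*p)) == 0) {
    *out = cached_result;
    return true;
  }
  return false;
}

static void store_answer(const struct params *p, const struct result *r)
{
  cached_params = *p;
  cached_result = *r;
  have_cached = true;
  caches_valid = true;
}
```

The `memcmp` works here because the struct is all-`int` with no padding; the real code would compare fields explicitly.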
- cache3 gets wiped when the city size changes. Given the same set of
tiles, however, if the city grew, that just means we can add
combinations that have more workers; and if the city shrank, that just
means we should eliminate combinations that require more workers than we
have (but actually we can just skip those combinations rather than delete
them, and if the city grows again, we'll have them all back for free!).
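The skip-don't-delete idea boils down to a filter on worker count. A sketch, with `combo` as a hypothetical cached arrangement (only its worker count matters here):

```c
#include <stdbool.h>

/* Hypothetical cached arrangement; the real entry would also carry
 * production data. */
struct combo { int workers; };

/* A combination stays usable iff it doesn't need more workers than the
 * city currently has.  On shrink we skip (not delete) the big ones, so
 * if the city regrows they become usable again for free. */
static bool combo_usable(const struct combo *c, int city_size)
{
  return c->workers <= city_size;
}

/* Count the cached combinations usable at the current size. */
static int count_usable(const struct combo *cache, int n, int city_size)
{
  int count = 0;
  for (int i = 0; i < n; i++) {
    if (combo_usable(&cache[i], city_size)) {
      count++;
    }
  }
  return count;
}
```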
- similarly, cache3 gets wiped when the set of tiles changes. But (a) a
tile not used in any optimal combination doesn't matter (think glacier).
And (b) I'll write a long post tomorrow about how to update when a tile
we're using changes (it'll subsume (a)).
The following are ways to speed things up even without caching across calls:
- we can do better than checking all arrangements of entertainers,
scientists, and taxmen. We can check how many entertainers are needed
to satisfy the constraints in log(n) time by binary search, and the same
for taxmen. If e+t > number of specialists available, we're done: we
can't satisfy the constraints. Otherwise, we only look at arrangements
that use at least that many entertainers and taxmen.
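The binary search relies on the predicate being monotone: if e entertainers satisfy the happiness constraint, so do e+1. A sketch under that assumption, with `city_content_with` as a hypothetical stand-in for the real city query (the toy model gives 2 luxury per entertainer):

```c
#include <stdbool.h>

/* Hypothetical model: each entertainer yields 2 luxury, and the city
 * needs `luxury_needed` luxury to be content. */
static bool city_content_with(int entertainers, int luxury_needed)
{
  return 2 * entertainers >= luxury_needed;
}

/* Smallest e in [0, max_specialists] satisfying the constraint, or -1
 * if even turning everyone into entertainers is not enough.  Uses
 * O(log n) queries instead of checking every count. */
static int min_entertainers(int max_specialists, int luxury_needed)
{
  if (!city_content_with(max_specialists, luxury_needed)) {
    return -1;  /* constraints unsatisfiable: bail out early */
  }
  int lo = 0, hi = max_specialists;
  while (lo < hi) {
    int mid = lo + (hi - lo) / 2;
    if (city_content_with(mid, luxury_needed)) {
      hi = mid;       /* mid works; try fewer */
    } else {
      lo = mid + 1;   /* mid fails; need more */
    }
  }
  return lo;
}
```

The same routine would run once for entertainers and once for taxmen; if the two minima together exceed the available specialists, we're done.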
- when doing binary search, the big cost will be querying the city for a
particular allocation of workers and specialists. Since we're searching
for both entertainers and taxmen, we may as well make each query answer
questions about both. That is, if I have n specialists to use and I want
to know whether the city is happy with n/2 entertainers, the remaining
n/2 specialists should be taxmen.
- when doing binary search, we should also be smart about using what we
already know. If we know 3 taxmen isn't enough, there's no point checking
whether 2 is.
- if the parameters say that, for instance, luxuries are worthless
(beyond what's needed to satisfy the happiness requirement), then
instead of an O(n^2) loop to get the best arrangement of e+s+t=n we can
do an O(n) loop to get the best arrangement of s+t=n-e, with e fixed at
the minimum required value.
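With e pinned at its minimum, the remaining choice is a single split s + t = n - e, so one linear scan suffices. A sketch, with `arrangement_value` as a hypothetical stand-in for evaluating an arrangement under the current parameters:

```c
/* Hypothetical per-specialist values under the current parameters. */
#define SCI_VALUE 3
#define TAX_VALUE 2

static int arrangement_value(int scientists, int taxmen)
{
  return SCI_VALUE * scientists + TAX_VALUE * taxmen;
}

/* Best split of the n - e_min remaining specialists: an O(n) scan,
 * instead of the O(n^2) scan over all e + s + t = n.  Returns the best
 * value and stores the winning scientist count in *best_s. */
static int best_value(int n, int e_min, int *best_s)
{
  int remaining = n - e_min;
  int best = -1;
  for (int s = 0; s <= remaining; s++) {
    int v = arrangement_value(s, remaining - s);
    if (v > best) {
      best = v;
      *best_s = s;
    }
  }
  return best;
}
```

With these toy values, scientists always win, so the whole remainder goes to science; the real evaluation would depend on the caller's weighting of bulbs versus gold.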