Complete.Org: Mailing Lists: Archives: freeciv-dev: August 2001:
[Freeciv-Dev] Re: {gen,spec,sort}list stuff

To: Justin Moore <justin@xxxxxxxxxxx>
Cc: freeciv-dev@xxxxxxxxxxx
Subject: [Freeciv-Dev] Re: {gen,spec,sort}list stuff
From: Raimar Falke <hawk@xxxxxxxxxxxxxxxxxxxxxxx>
Date: Thu, 30 Aug 2001 10:11:16 +0200
Reply-to: rf13@xxxxxxxxxxxxxxxxxxxxxx

On Wed, Aug 29, 2001 at 05:50:21PM -0400, Justin Moore wrote:
>    A patch for my new genlist.[ch] files against today's CVS is attached.
> I've started to notice some inconsistencies in games using the same random
> seeds (map and game) between games that use the old genlist and ones that
> use my genlist.  Yet I can't find any problems in my implementation of
> genlist (yes, I realize that by saying this I'm opening myself up to
> someone finding incredibly obvious bugs ... which is almost the point).
> I'd appreciate any extra eyes looking at the code.

Is it possible to have both lists running in parallel and compare their results?

>    There are currently two kinds of list iteration macros: one (STABLE_*)
> for lists where the number of elements doesn't change and another (REAL_*)
> for those that do (REAL_ because that's what most use ATM).  The REAL_*
> macros use the iterators, which are painfully slow but are very tolerant
> of one or more elements being removed from underneath them.

I'm not happy with the name REAL_.

>    A side-effect of making the genlist iterators functions: they show up
> in profiling so you can tell exactly what percentage of the old loops
> you've removed by converting to the new macros.
> > > > > I'm open to suggestions on how to cleanly remove elements from an
> > > > > array without screwing over iterators.
> > > >
> > > > As pointed you: restarting the loop.
> > > >
> > > > > One possibility that comes to mind is to have an associated array of
> > > > > boolean values which marks the entry as valid or not.
> > > >
> > > > Complicated.
> > >
> > >    Somewhat, but possibly more efficient than restarting the loop.  I'll
> > > see as I go along the pros and cons of each.  Plus, what if there are
> > > cases in which the loop must adhere to "only-once" semantics?  Some
> > > function that will break or return incorrect results if certain items are
> > > iterated over more than once?
> >
> > Another solution is a separate list:
> >
> > init_list(&tmp_list);
> >
> > for(;;)
> > {
> >   if(list_get_size(list)==0)
> >      break;
> >   item=list_unlink_first(list);
> >   if(condition)
> >   {
> >     destroy_item(item);
> >   }
> >   else
> >   {
> >      list_append(tmp_list,item);
> >   }
> > }
> > list=tmp_list;
> >
> > Clean, safe and efficient (not for long lists). Only some more writing
> > is needed. I can imagine a list_filter method which does all the above
> > and only takes the list and two callbacks (for condition testing and
> > destroying).
>    The problem isn't so much lists where the destruction is done within
> the body of the loop.  The problem is loops where three function calls
> down from the body of the loop, THAT function MAY remove an element from
> the list, so by the time you work back up the call stack, an element is
> gone and you don't even know it in the original loop.  Which is where
> having intelligent iterators comes in handy.
>    For the time being I'm going to go with the approach of "If you think
> this loop may remove elements from the list, use the iterators", and if
> not, use the new ones.  I'm pretty sure we can migrate a good percentage
> of the loops over to the new macros.
> > >    Removing current data is ok, but not 'next'.  Another possible (but
> > > more involved) solution is to keep track of how many iterators are
> > > currently looping over a list.  When an item is removed from a list, go
> > > through and find if that deletion messes with any of the iterators.  More
> > > complex, but more robust.  Comments?
> >
> > Too complex. Just fix the code which does something like this. From a
> > performance point of view: if you try to catch all cases you would be
> > slow. And the whole point is to be fast(er).
> >
> >     Raimar
> >
> > --
> >  email: rf13@xxxxxxxxxxxxxxxxx
> >  "Premature optimization is the root of all evil."
> >     -- D. E. Knuth in "Structured Programming with go to Statements"
>    I enjoy how your last sentence is somewhat at odds with your sig. :)
> IMHO, the _long-term_ goal is to be faster, but ATM it's just to get the
> code correct because I *know* the code will be faster once we replace many
> instances of iterators with STABLE_ macros.
>    Not to fan an old flame, but I think this is a good argument for having
> a development branch; we could put almost-100%-working code into it and
> have a larger number of people bang on it.  For the moment I just have to
> plead with the list to apply my patch and find where things break. :) It
> hasn't crashed my games out or done any serious damage as far as I can
> tell; please, give it a shot. *sad look in eyes*

Please, people, test it. Looking at it, I think that you should be more
restrictive with your arguments:
 - pgenlist->capacity should be set for every list and yet you test
 for it in genlist_resize
 - is this mess with the current genlist_sort and NULL needed? It is
 - which code unlinks items which aren't in the list (assert,


 email: rf13@xxxxxxxxxxxxxxxxx
 "I do feel kind of sorry for Microsoft. Their attornies and marketing
  force must have tons of ulcers trying to figure out how to beat (not
  just co-exist with) a product that has no clearly defined (read
  suable) human owner, and that changes on an hourly basis like the
  sea changes the layout of the sand on a beach. Severely tough to
  fight something like that."
    -- David D.W. Downey at linux-kernel
