[Freeciv-Dev] Re: Unsafe assertions. (PR#864)
On Sat, 28 Jul 2001, Gaute B Strokkenes wrote:
> >> Note that I did not weaken the assertions. normalize_map_pos()
> >> returns TRUE even if the tile is not normalized.
> >
> > But when the author wrote the assertion, did they mean to assert
> > that it was a real tile, or that it was a normalized position?
>
> In the absence of any further evidence, it seems reasonable to assume
> that the original author actually bothered to have a look at what
> normalize_map_pos() does before they decided to use it. So I think we may
> safely assume the former.
Except that it is used both in places where the position needs to be
normalized and in places where it does not.
> > But who wrote that mess anyway? Maybe they can shed some light on
> > what the hell they were thinking.
>
> I don't think it's a mess and no, I didn't write it.

So asserts with side-effects that are sometimes required and sometimes not,
all over the place, isn't a mess? I must have high standards.

> This part of the
> code seems to have been written with the Golden Rule in mind: be
> liberal in what you accept, and conservative in what you emit. In
> other words: accept non-normalised coordinates without barfing, and
> try to make sure that any coordinates you do produce (or pass to other
> functions) are normalised. I think that's a perfectly reasonable
> approach to take.
You get more bugs that way. Every bit of code needs to ensure that its output
is correct, and that its input is correct. But remembering to always check
everything is hard to do, and people WILL forget to check their output, or
input, or both. When they do forget, it's not noticed, because everything
else accepts bad input and doesn't produce bad output. Until a place with a
mistake interfaces with another place with a mistake. By then it's hard to
find the problem, because the mistakes have gone unnoticed for so long.
> Another reasonable approach would be to always normalise coordinates
> when you create them, and make it a hard error to attempt to use
> non-normalised coordinates. This would avoid the overhead of
> normalising everything twice.
That way gives the best performance, and it's also the policy I adopted for
my freeciv work back in 1998. It's not just twice that you pay the overhead,
it's many times. Think about data passed down a stack 10 functions deep, where
each one checks both its input and its output for correctness. Only the first
one needs to actually ensure correctness, and it can do so faster, since that
code knows if and when the output might be wrong. Also consider a unit the AI
looks at a million times for various calculations. It's much faster if the AI
can assume the coordinates of the unit are valid and not have to check them
each time it looks at the unit.
There is also a third way, which is the best from the perspective of code
correctness. You make the data "opaque" so that it can only be compared,
modified, or dereferenced via access functions. You then ensure that this
small set of functions will never produce bad data or fail because of bad
data. All the rest of the code is then officially let off the hook when it
comes to correctness. The problem with this is that it's slower. There is no
way (and can't be a way) to tell the access functions that you know you're
passing them correct data, or that the modification you want to make will not
cause a problem, so they don't need to deal with that.
You can see these two approaches in things like strings. C uses the fast
method of trusting the caller. If you call printf or strcat, you need to
take care not to overflow a buffer by passing one that is too small. Something
like perl uses the slower, safer method of checking for null pointers or
buffer overflows every time you perform a string operation.