any value in a standard for defining bit flags?

Robert P. J. Day rpjday at mindspring.com
Mon Jul 3 09:19:36 UTC 2006


On Sun, 2 Jul 2006, Paul Fox wrote:

>  >
>  >   i don't feel like doing any real work so i'm just going to
>  > pontificate.  the code for defining bit flags in BB is a little
>  > chaotic as there are three standards scattered throughout the code.
>  ...
>  >   wouldn't it be easier to pick one and go with that?  certainly,
>
> while i myself would never define bit numbers in decimal, i have
> no problem maintaining them, or spotting errors in them.  i guess
> all i'm getting at is that this is pretty far down on my list of
> busybox peeves...

i *did* say i didn't feel like doing any real work. :-)  i wouldn't
suggest *enforcing* any kind of standard like that, but maybe just a
list of recommended conventions in an applet coding-style guide?

in any event, as you wrote, defining bit flags with either hex
constants or bit shifts is equally reasonable, but defining them with
literal power-of-two decimal constants does seem unnecessarily
obtuse.  just my $0x02c.
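for concreteness, here's roughly what the three styles look like side
by side (the OPT_* names here are invented for illustration, not real
applet flags; in real code you'd obviously pick just one set):

    /* style 1: hex constants */
    #define OPT_FORCE_HEX     0x01
    #define OPT_QUIET_HEX     0x02
    #define OPT_RECURSE_HEX   0x400

    /* style 2: bit shifts */
    #define OPT_FORCE_SHIFT   (1 << 0)
    #define OPT_QUIET_SHIFT   (1 << 1)
    #define OPT_RECURSE_SHIFT (1 << 10)

    /* style 3: literal power-of-two decimals -- the obtuse one */
    #define OPT_FORCE_DEC     1
    #define OPT_QUIET_DEC     2
    #define OPT_RECURSE_DEC   1024

all three expand to the same values; the first two just make the
"this is a single bit" intent visible at a glance.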

rday

p.s.  the other reason i prefer a bit of a standard in this case is to
visually distinguish when someone is using, say, the value 1024 as a
buffer size versus using it as a bit mask.  if i see "1024", i might
conclude it's defining an array to hold a block.  if, however, i see
"0x400", i'd more likely assume it was meant as a bit setting.  it
might occasionally make it easier to spot typos.  but i'm not going
to lose any sleep over this.
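e.g., with hypothetical names:

    char blockbuf[1024];           /* decimal 1024 reads as a size  */
    #define FLAG_RAWMODE 0x400     /* hex 0x400 reads as bit 10 set */

    /* and a mistyped mask is easier to catch in hex: 0x404 is
     * visibly two bits, while 1028 doesn't leap out as 1024 + 4. */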


