sizeof in Macros. (Was: Specifying dependencies more clearly)


From: Ralph Corderoy
Subject: sizeof in Macros. (Was: Specifying dependencies more clearly)
Date: Thu, 10 Nov 2022 09:31:47 +0000

Howdy Alejandro,

> > Okay, here we go for a rant.

Consider the cost of lost opportunities.

> Since I wrote the code from memory, I had a few typos, but the idea
> was there...
>
> >      typedef struct {
> >          size_t  length;
> >          u_char  *start;
> >      } str_t;
> > 
> >      #define length(s)   (sizeof(s) - 1)
> > 
> >      #define str_set(str, text)  do       \
> >      {                                    \
> >          (str)->length = length(test);    \
> >          (str)->start = (u_char *) text;  \
> >      } while (0)

And s/test/text/.

The lack of parentheses around ‘text’ in the assignment to ‘start’ looks
wrong.

> > Of course, cowboy programmers don't need terminating NUL bytes

The early Unix filesystem didn't NUL-terminate full-length filenames in
its sixteen-byte directory entries: two bytes for the inode number,
fourteen for the filename.

Not using terminating NULs allows substrings to be sliced from the
original without copying the bytes.
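As a sketch of what that buys, a slice of the quoted str_t can just
point into its parent's bytes; str_slice() here is only illustrative,
not anything from the patch:

    /* A slice shares the parent's storage: no allocation, no copy,
     * and no terminating NUL to write. */
    str_t str_slice(str_t s, size_t off, size_t len)
    {
        str_t sub;

        sub.start  = s.start + off;
        sub.length = len;
        return sub;
    }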

> > And the use of that macro is typically for things like:
>
>         str_t  str;
>         str_set(&str, "foo");
>
> > And then some new programmer in the team writes a line of code
> > that's something like:
>
>         str_set(&str, cond ? "someword" : "another");
>
> > Then testing reveals some issue if cond is true.  You see only
> > "somewor"; hmm. 
...
> > I quickly realize that it is due to the ternary operator decaying
> > the array into a pointer and sizeof later doing shit.
...
> > My patch just changes
> >      #define length(s)   (sizeof(s) - 1)
> > to:
> >      #define length(s)   (nitems(s) - 1)
> >
> > (nitems() is defined to be the obvious sizeof division (called
> > ARRAY_SIZE(9) in the Linux kernel)

The bug is the original macro str_set() doesn't document its ‘text’
parameter must be a string constant.  Macros I have to hand are more
explicit: ‘string constant s’.

    /* DIM gives the number of elements in the one-dimensional array a. */
    #define DIM(a) (sizeof (a) / sizeof (*(a)))

    /* LEN gives the strlen() of string constant s, excluding the
     * terminating NUL. */
    #define LEN(s) (sizeof (s) - 1)

By passing a char pointer on what's presumably a 64-bit pointer machine,
P64, sizeof gives the pointer's eight bytes, one is knocked off as if
there were a terminating NUL, and the first seven bytes of "someword"
result.
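A two-printf check makes the decay visible; the eight assumes P64 as
above:

    #include <stdio.h>

    int main(void)
    {
        int cond = 1;

        /* The string literal is a char[9]: "someword" plus its NUL. */
        printf("%zu\n", sizeof "someword");                      /* 9 */
        /* Both ?: operands decay to char *, so sizeof sees a pointer. */
        printf("%zu\n", sizeof (cond ? "someword" : "another")); /* 8 on P64 */
        return 0;
    }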

Your patch switches to nitems() and your description makes me think it's
like DIM() above.  That takes the eight bytes of the pointer given by
the first sizeof and divides it by one, doing nothing, as the second
sizeof gives the size of a u_char.  So we still arrive at eight, knock
one off for the NUL as you write ‘nitems(s) - 1’, giving seven.  So I
don't see why this wouldn't also give "somewor" instead of "someword".
How did it fix the problem?
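Spelling that out, under the assumption that nitems() is the plain
division like DIM():

    /* Assumed definition, matching your description; not the actual patch. */
    #define nitems(a) (sizeof (a) / sizeof (*(a)))

    /* The ternary operand is already a char pointer, so on P64:
     *   nitems(cond ? "someword" : "another")
     *       == sizeof (char *) / sizeof (char) == 8 / 1 == 8
     * and nitems(...) - 1 == 7, truncating to "somewor" as before. */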

Unless the value needs to be known at compile time, using strlen(3)
would work in all cases, save much human time at the cost of a little
accumulated machine time, and in the simple common case of a string
constant it will probably be evaluated by any optimising compiler into a
constant.

    typedef struct {
        size_t length;
        u_char *start;
    } str_t;

    #define str_set(str, text) \
        do { \
            (str)->length = strlen(text); \
            (str)->start = (u_char *)(text); \
        } while (0)
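And the line that tripped up the new programmer then behaves as
expected:

    str_t str;
    int cond = 1;

    str_set(&str, cond ? "someword" : "another");
    /* str.length == 8 and str.start points at "someword", the full string. */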

-- 
Cheers, Ralph.


