Re: some unsigned/NSInteger to NSUInteger changes in -gui


From: David Chisnall
Subject: Re: some unsigned/NSInteger to NSUInteger changes in -gui
Date: Tue, 10 Apr 2012 18:24:52 +0100

On 10 Apr 2012, at 18:18, Sebastian Reitenbach wrote:

> +     {
> +       int tmp = (int)_selected_item;
> +          [aDecoder decodeValueOfObjCType: @encode(int) at: &_selected_item];
> +          _selected = [_items objectAtIndex: tmp];
> +     }

No, this is still wrong, and I'm not really sure what it's trying to do...

Let's say you're on some big-endian LP64 system.  NSInteger will be a 64-bit 
integer while int will be a 32-bit integer.  You pass a pointer to the 
NSInteger to aDecoder, and tell it that it's a pointer to an int.  It will then 
cast this pointer and write a 32-bit value into the first 32 bits of the 
ivar.  Unfortunately, because this is a big-endian system, those first 32 bits 
are the most significant ones, so you've now set the value to something about 
four billion (2^32) times as big as it should be...
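
To see the failure concretely, here is a minimal C sketch (my own 
illustration, not the GNUstep code), with a long standing in for the 
NSInteger ivar and memcpy standing in for the decoder's write through the 
cast pointer:

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            long selected_item = 0;   /* stand-in for the NSInteger ivar */
            int decoded = 3;          /* the 32-bit value the decoder read */

            /* Write the int through the start of the long, as the cast
               pointer does.  Only the first 4 of the 8 bytes change. */
            memcpy(&selected_item, &decoded, sizeof(int));

            /* Little-endian: prints 3.  Big-endian: the 3 lands in the
               most significant bytes and this prints 12884901888,
               i.e. 3 << 32. */
            printf("%ld\n", selected_item);
            return 0;
        }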

As I said in my last email, the correct form is:

        int tmp;
        [aDecoder decodeValueOfObjCType: @encode(int) at: &tmp];
        _selected_item = tmp;

This creates an int on the stack and passes a pointer to it to aDecoder, which 
then loads an int-sized value and stores it in the int.  You then assign this 
value to the ivar.  The implicit conversion (you can make it an explicit cast 
if you prefer) will sign-extend it to the full NSInteger width.
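
For symmetry, the matching encoder path needs the reverse conversion, 
narrowing through a temporary before encoding.  A sketch, assuming the ivar is 
an NSInteger _selected_item and the coder parameter is called aCoder:

        int tmp = (int)_selected_item;
        [aCoder encodeValueOfObjCType: @encode(int) at: &tmp];

(This truncates values that don't fit in 32 bits, but that is inherent in 
archiving the field as an int.)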

David

-- Sent from my Cray X1

