
In dump-xsd.c is the code:
    case SMI_BASETYPE_UNSIGNED64: {
        SmiUnsigned64 min, max;

        min = SMI_BASETYPE_UNSIGNED64_MIN;
        max = SMI_BASETYPE_UNSIGNED64_MAX;

        fprintStdRestHead( f, smiType );

        smiRange = smiGetFirstRange( smiType );
        while( smiRange ) {
            if( smiRange->minValue.value.unsigned64 < min ) {
                min = smiRange->minValue.value.unsigned64;
            }
            if( smiRange->maxValue.value.unsigned64 > max ) {
                max = smiRange->maxValue.value.unsigned64;
            }
            smiRange = smiGetNextRange( smiRange );
        }

        fprintSegment( f, 0, "<xsd:minInclusive value=\"%lu\"/>\n",
                       (unsigned long)min );
        fprintSegment( f, 0, "<xsd:maxInclusive value=\"%lu\"/>\n",
                       (unsigned long)max );
        fprintSegment( f, -1, "</xsd:restriction>\n" );
        break;
    }
The behaviour may vary on different platforms, but at least on some, the cast (unsigned long) above needs to be (unsigned long long), or perhaps removed; otherwise only the least significant 32 bits of the value will be used. I'm not certain whether %lu ought to be %llu as well.
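
For illustration only, here is a small standalone program (not part of the libsmi sources) that shows the truncation on platforms where unsigned long is 32 bits wide; the corrected libsmi calls would presumably cast to unsigned long long and print with %llu instead:

    /* Standalone demo of the truncation, not libsmi code. */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long max = 18446744073709551615ULL;  /* 2^64 - 1 */

        /* On platforms where unsigned long is 32 bits, the cast drops
         * the upper 32 bits and this prints 4294967295 (2^32 - 1). */
        printf("as unsigned long:      %lu\n", (unsigned long)max);

        /* This prints the full value, 18446744073709551615. */
        printf("as unsigned long long: %llu\n", max);

        return 0;
    }

On an LP64 platform both lines print the full value, so the problem only shows up where unsigned long is 32 bits (for example most 32-bit Unix systems).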

On Mon, Jan 12, 2009 at 09:12:58AM +0100, Arndt Jonasson wrote:
> In dump-xsd.c is the code
> [...]
> The behaviour may vary on different platforms, but at least on some, the cast (unsigned long) above needs to be (unsigned long long), or perhaps removed, otherwise only the least significant 32 bits of the constant will be used. I'm not certain whether %lu ought to be %llu.
Yes, %llu makes more sense; I changed the code to use UINT64_FORMAT (defined in config.h). In general, I think libsmi should be changed to use stdint.h and inttypes.h definitions with fallback definitions for systems that do not have these header files.
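
As an illustration of that suggestion, here is a minimal sketch of the portable-format-string idea. HAVE_INTTYPES_H and smi_u64 are placeholder names invented for this example, and the UINT64_FORMAT defined below only mimics the macro mentioned above; it is not the actual config.h definition:

    /* Sketch only -- not the actual libsmi change. */
    #include <stdio.h>

    #if defined(HAVE_INTTYPES_H)      /* placeholder configure-time macro */
    #include <inttypes.h>
    #define UINT64_FORMAT "%" PRIu64
    typedef uint64_t smi_u64;
    #else
    #define UINT64_FORMAT "%llu"      /* fallback: assumes long long is 64-bit */
    typedef unsigned long long smi_u64;
    #endif

    int main(void)
    {
        smi_u64 max = 18446744073709551615ULL;   /* 2^64 - 1 */
        printf("max = " UINT64_FORMAT "\n", max);
        return 0;
    }

Compiled with or without -DHAVE_INTTYPES_H, both paths print the full 20-digit value.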
/js