enums - Type of enumeration constants inside the enumerator list in C


I'm working on a multi-platform enumeration parser and found some weird behaviour while trying to answer the question in the title.

Question a)

Does the C standard determine the type of enumeration constants before the enumeration declaration is completed?

For Keil ARMCC, as an example:

enum e {
    val0 = (signed char)126,
    val1,
    val2,
    size0 = sizeof(val0),
    size1 = sizeof(val1),
    size2 = sizeof(val2)
};

I get size0 = 1, size1 = 1, size2 = 8. (If I evaluate the size of the enum constants outside the definition, they have the size of int.)

Shouldn't they all equal sizeof(int)? (Remembering that int, in this case, has a size of 4 bytes.)

Question b)

For Keil C251, I have the following:

signed int value0 = (signed char)-1;
enum{ value1 = (signed char)-1 };
enum{ value2 = -1 };

printf( "is value0 equal value1? ---> %s", value0 == value1 ? "yes!" : "no!" );
printf( "is value0 equal value2? ---> %s", value0 == value2 ? "yes!" : "no!" );

Which prints:

is value0 equal value1? ---> no!
is value0 equal value2? ---> yes!

Shouldn't both print yes?

Is there a difference between the definitions of value0 and value1 that I'm missing, maybe the type cast? Or is it a compiler bug?

In C (unlike C++), an enumeration constant has type int. It's legal to refer to an enumeration constant before the end of the type declaration.
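For instance, here is a minimal sketch of what a conforming compiler is required to do: sizeof applied to an earlier enumeration constant, even inside the enumerator list, must yield sizeof(int). (The names check, val0 and size0 here are mine, for illustration.)

#include <stdio.h>

/* On a conforming compiler, val0 already has type int at this point,
   so size0 must equal sizeof(int), typically 4. */
enum check {
    val0 = (signed char)126,
    size0 = sizeof(val0)
};

int main(void)
{
    printf("size0 = %d, sizeof(int) = %zu\n", (int)size0, sizeof(int));
    return 0;
}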

If Keil ARMCC gives sizeof(val0) != sizeof(int), where val0 is an enumeration constant, then Keil ARMCC is not a conforming C compiler. I've seen other questions here that indicate it's non-conforming.

Being non-conforming is not a compiler bug (unless the vendor claims it's conforming, which as far as I know they don't).

As for part b:

enum e {
    min_signed_char_0 = (signed char)(-128),
    min_signed_char_1 = -128,
    min_signed_char_2 = (signed int)(signed char)(-128),
    min_signed_char_3 = (unsigned int)(signed char)(-128),
    min_signed_char_0_plus_1 = min_signed_char_0 + 1,
    min_signed_char_1_plus_1 = min_signed_char_1 + 1,
    min_signed_char_2_plus_1 = min_signed_char_2 + 1,
    min_signed_char_3_plus_1 = min_signed_char_3 + 1,
};

Almost any conforming C compiler should give the min_signed_char_{0,1,2} constants the value -128 (of type int), and the min_signed_char_{0,1,2}_plus_1 constants the value -127 (also of type int). There is possible wiggle room for an implementation with SCHAR_MIN == -127, which is possible but unlikely, and apparently not the case for the Keil compiler(s). If you're getting different results, it's a bug in the compiler.
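A quick way to check this on a given compiler is with C11 _Static_assert; this is a minimal sketch assuming SCHAR_MIN == -128, as discussed above (the names e2, m0 and m0_plus_1 are mine, for illustration):

/* Compile-time checks (C11), assuming SCHAR_MIN == -128. */
enum e2 {
    m0 = (signed char)(-128),
    m0_plus_1 = m0 + 1
};

_Static_assert(m0 == -128, "expected value -128");
_Static_assert(m0_plus_1 == -127, "expected value -127");
_Static_assert(sizeof(m0) == sizeof(int),
               "enumeration constants must have type int");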

But the definition of min_signed_char_3 is problematic. The int value -128 converted to signed char doesn't change the value. Converting that to unsigned int yields UINT_MAX + 1 - 128 (assuming 32 bits, 4294967168). An enumeration constant specified with a value outside the range of int is a constraint violation, requiring a diagnostic. (Did you get a compile-time warning?) The result, if the compiler doesn't reject the program, is undefined.
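To see where the out-of-range value comes from, here is a small sketch of the same conversion chain outside an enum, assuming 32-bit unsigned int:

#include <stdio.h>

int main(void)
{
    /* -128 fits in signed char, so the first cast changes nothing. */
    signed char sc = (signed char)(-128);

    /* Conversion to unsigned int wraps modulo UINT_MAX + 1:
       -128 becomes UINT_MAX + 1 - 128 = 4294967168 on 32 bits. */
    unsigned int u = (unsigned int)sc;

    printf("%u\n", u);   /* prints 4294967168 with 32-bit unsigned int */
    return 0;
}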

