signed char problems

PostPosted: Feb 18, 2002 @ 4:56pm
by Jim
Having big headaches with the signed char problem.
I need to read in various text files, which are in several different languages. These text strings thus include extended ASCII characters - those above 127.

As EVC++ only sees signed chars (even with the _CHAR_UNSIGNED compiler pragma set) it gives you incorrect output values, e.g.

'ä'=0xe4=byte(228)=signed char(-28)
'ü'=0xfc=byte(252)=signed char(-4)
'ö'=0xf6=byte(246)=signed char(-10)
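For reference, those negative values are sign extension at work: when plain `char` is signed, a byte above 127 becomes negative as soon as it is widened to `int`, and casting through `unsigned char` first recovers the raw 0..255 value. A minimal sketch (assuming a compiler where plain `char` is signed, as in eVC++ here):

```c
/* Widening a signed char sign-extends: byte 0xE4 ('ä') becomes -28,
   since 228 - 256 = -28. */
int widened(char c)
{
    return (int)c;
}

/* Casting through unsigned char before widening recovers the raw
   byte value, 228 for 0xE4. */
int raw_byte(char c)
{
    return (int)(unsigned char)c;
}
```

The same arithmetic gives -4 for 0xFC ('ü', 252 - 256) and -10 for 0xF6 ('ö', 246 - 256), matching the values above.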

I need to obtain the correct ASCII values for these letters :evil:

'ä' should give the decimal value 132.

Any help would be much appreciated (even a dirty fix would do).

Jim.

PostPosted: Feb 18, 2002 @ 5:52pm
by Dan East

PostPosted: Feb 18, 2002 @ 5:57pm
by billcow

PostPosted: Feb 19, 2002 @ 11:37am
by Jim
'ä' should give the decimal value 132.

char a = 'ä';
// has a decimal value of -28

unsigned char b = (unsigned char) a;
// has a decimal value of 228

How do I get the correct ANSI 132 decimal value?

Jim.

ps. Am I doing somefink real dumb?
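A note on the discrepancy: 228 is 'ä' in the Windows ANSI code page (Windows-1252), while 132 is 'ä' in the OEM/DOS code page (CP437, also CP850), so getting 132 means converting between code pages, not just fixing the sign. A minimal sketch of that mapping for the three characters in this thread (the helper name `ansi_to_oem437` is made up for illustration; a full conversion would need a table for the whole upper 128 bytes, or an API such as CharToOemBuff on desktop Windows):

```c
/* Hypothetical helper: map a few Windows-1252 (ANSI) bytes to their
   CP437 (OEM) equivalents. */
static unsigned char ansi_to_oem437(unsigned char ansi)
{
    switch (ansi) {
    case 0xE4: return 0x84; /* 'ä': ANSI 228 -> OEM 132 */
    case 0xFC: return 0x81; /* 'ü': ANSI 252 -> OEM 129 */
    case 0xF6: return 0x94; /* 'ö': ANSI 246 -> OEM 148 */
    default:   return ansi; /* the 7-bit ASCII range is identical */
    }
}
```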

PostPosted: Feb 19, 2002 @ 1:35pm
by Dan East
That makes no sense. You compiled the exact code sample you posted? How are you obtaining b's value, via the debugger?

Dan East

PostPosted: Feb 19, 2002 @ 2:17pm
by refractor

PostPosted: Feb 19, 2002 @ 2:33pm
by Dan East

PostPosted: Feb 19, 2002 @ 2:40pm
by refractor

PostPosted: Feb 19, 2002 @ 4:19pm
by Jim

PostPosted: Feb 19, 2002 @ 4:35pm
by refractor


This page probably answers the "why ANSI" part... but I don't know the conversion routine (maybe you'll have to go through ANSI->UNICODE->ASCII?) ... or just stick to Unicode in the first place if you can.

Cheers,

Refractor.
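The two-step route mentioned above (ANSI to Unicode, then Unicode to the OEM code page) can be sketched as follows. On desktop Windows the two steps would be MultiByteToWideChar(CP_ACP, ...) followed by WideCharToMultiByte(CP_OEMCP, ...); the tiny tables here are stand-ins covering only the three characters from this thread, assuming Windows-1252 as the ANSI code page and CP437 as the OEM one:

```c
#include <stddef.h>

struct map { unsigned short unicode; unsigned char byte; };

/* Windows-1252 byte -> Unicode code point (1252 matches Latin-1 here). */
static const struct map ansi_to_uni[] = {
    { 0x00E4, 0xE4 }, /* ä */
    { 0x00FC, 0xFC }, /* ü */
    { 0x00F6, 0xF6 }, /* ö */
};

/* Unicode code point -> CP437 (OEM) byte. */
static const struct map uni_to_oem[] = {
    { 0x00E4, 0x84 }, /* ä -> 132 */
    { 0x00FC, 0x81 }, /* ü -> 129 */
    { 0x00F6, 0x94 }, /* ö -> 148 */
};

static unsigned char ansi_to_oem(unsigned char ansi)
{
    size_t i, j;
    /* Step 1: find the Unicode code point for the ANSI byte. */
    for (i = 0; i < sizeof ansi_to_uni / sizeof ansi_to_uni[0]; i++) {
        if (ansi_to_uni[i].byte != ansi)
            continue;
        /* Step 2: find the OEM byte for that code point. */
        for (j = 0; j < sizeof uni_to_oem / sizeof uni_to_oem[0]; j++)
            if (uni_to_oem[j].unicode == ansi_to_uni[i].unicode)
                return uni_to_oem[j].byte;
    }
    return ansi; /* plain ASCII passes through unchanged */
}
```

Sticking to Unicode throughout, as suggested, avoids the second step entirely.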