>int getchar( void );
>int
>getCHAR
Fricking toy language.
redditfrog please, how is getchar supposed to return a value that's not a char when it needs to signal an EOF? Enlighten us.
Should've made EOF a char??
which char is supposed to become invalid value when all values of char are valid characters?
Why the frick should I know? Ask boomers who came up with this moronic design. They could've used 0 for the EOF and 1 for null terminator (which is a moronic design by itself)
>0 for EOF
0 is a valid character and therefore cannot be used to signal end of file
It's only valid because moronic boomers said so.
it is always valid
This zoomer frog with the puffy hair is so funny :^)
Wouldn't you just return EOT, which is 0x04?
Why the frick are you autists signaling in-band? Why is a "getchar" even concerned with an EOF?
I do not know nor endorse use of getchar, it's your fault for wanting to do it.
Make another function called char eof(FILE*) that checks for EOF. Just call it before calling getchar.
how about just read(2) and write(2) more than one byte and since now you get amount of bytes you read back into your buffer you already know if you ran out of bytes to read when read returns 0?
fread wraps these syscalls and it just works.
Because that's antisemitism you fricking hamas chud.
What's the problem?
There are 256 possible values that can be stored in a char.
getchar needs to be able to return 257 possible values -- all values that fit in a char, plus EOF.
> BEL (0x07 = \a), which causes a terminal to beep and/or flash.
Wonderful use of valuable space. I'm sure there are plenty of useless boomer idiotic characters that could've been used as EOF.
Suppose I am writing a parser for an image format. I have piped the image into stdin for my parser to read. The image can contain values in the range 0-255 for each color channel of each pixel. Here, a char is not representing text but binary data. How are we to represent that there are no more pixels left to read?
> Here, a char is not representing text but binary data.
???????? CHAR should represent a CHARacter????? Why the frick are cniles so moronic?
And chars are represented by numbers.
> chars are represented by numbers.
Cniles read this shit and see nothing wrong.
Why would you have special 'CHAR' type then???
Because it tells the compiler what to expect in that type so it can optimize the assembly it generates.
This zoomer does not understand that all data is bytes. He thinks the computer magically writes letters into memory without using numbers
It's not special. It's a base integer data type. There are 5 base integer data types, plus the unsigned versions:
char (minimum 8 bits)
short (minimum 16 bits)
int (minimum 16 bits)
long (minimum 32 bits)
long long (minimum 64 bits)
Each platform is free to define the size of these data types provided that:
sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long).
The char data type is used for bytes. A character is just one type of byte. But there is no reason why char necessarily represents an ASCII-encoded character, or even any form of text data.
> But there is no reason why char necessarily represents a char
Cniles read this and see nothing wrong.
Why the frick would you have a type for a char then????????
We have a type for bytes. We call it char, because the most common use for a single byte is to store a character. But it can store any one byte value.
By the way, the type of a character literal is int, not char.
>we're using CHARACTER type for representing the bytes!
>why you couldn't name the type 'byte' and use the different type for characters?
>w-w-we just couldn't, okay??? SHUT UP!!
Toy language.
>type of a character literal is int, not char.
Fricking moronic toy language.
Here's something for ya: many people think byte means 8 bits. It doesn't -- that's what an octet is. A byte is the smallest addressable unit of memory on a machine... which is usually 8 bits, but some machines have word sizes of 18 or 36, and used 9 bit bytes. A char can be 9 bits on these platforms. Now imagine the confused morons screeching that the byte data type was sometimes 9 bits. Imagine having to explain to someone why we have a BYTE_BITS constant.
You are learning that conventions are done for our own sake to abstract away the fact that computers are just binary machines.
Now you just have to get over it.
the type for bytes is uint8_t, char's size is implementation dependent and shouldn't be used to represent exactly a byte of data.
You can make that logical distinction, but in practice uint8_t is always a typedef for unsigned char. Because char is at least 8 bits, and char is the smallest C type.
>By the way, the type of a character literal is int, not char.
Is there a difference to the "type" of the literal and the effect of integer promotion?
You definitely shouldn't use a char for that Black person, not only is the size of char left to the implementation, it's also not its fricking purpose. You use the intN_t types for that, in your case 16 bits is enough.
sizeof(char) == 1 always
But yes, for byte of data you should use unsigned char.
See
That is absolutely not true, C never enforces a specific size for its primitives, hence why uintN_t and intN_t exist. I've already found issues with int not being the same fricking size in two different stdlibs.
"sizeof(char) == 1" is always true, moron. Learn some fricking C. sizeof returns the size in units of char, not bytes.
Black person read the actual ISO definition of C, it is not guaranteed to be a byte or "1" unit, it happens that it's most common size but is by no means guaranteed
Standard literally says so btw.
https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
What then about 2 byte char platforms? I can't see it guaranteeing to give a 1 and also letting the size of char up to implementation, something else must be missing.
>2 byte
you mean CHAR_BIT == 16 and sizeof(char) == 1?
Byte is a nonstandard thing, just because your babyduck syndrome makes you think that byte means CHAR_BIT = 8, doesn't mean it's true, there used to be machines with moronic sizes like 13 bits per byte and it worked just fine.
Ah I see my mistake, so if you need exactly 8 bits you should use uint8_t, but if you need a "byte" you can use the primitives. Still, for reading a file you should use uint8_t since most parsers assume an 8-bit byte.
on such a platform uint8_t won't ever exist, but normally if you use such a platform you already know this and can optimize for the case that each byte is 16 bits
Anyways why do we need int getchar again? since char is guaranteed to cover at least -127 to 127 and the ASCII table only has 128 characters?
I don't know which tard came up with idea of reading one char at a time when it's never a good solution. A good solution is allocating big buffer and copying it into it, and an even better one is simply memory mapping a buffer so it's already transparently in memory, maybe it's slower at the start but you can use SIMD on incoming data without worrying about copying shit anywhere so it's good.
I don't really like how the standard defines things.
>byte is implementation defined collection of bits
how is this convoluted?
Why can't they just say that a char is a byte?
Why can't they say 7-bit ASCII instead of "basic execution character set"?
The C spec is way too abstract and roundabout.
And of course, the answer is IBM
https://en.wikipedia.org/wiki/EBCDIC
Well it's a lawyer tier language, always has been this way, and I don't make the rules.
Also everyone knows that ASCII is 7 bit, only a heretic would think otherwise, so no one needs to talk about it.
The most cringe thing about C is that char is actually implementation defined whether it's unsigned or signed, and while that's annoying, it simply is proof that ASCII is 7 bit because 8th bit is a sign.
>The most cringe thing about C is that char is actually implementation defined whether it's unsigned or signed,
What's even more cringe is that "char", "signed char", and "unsigned char" are all distinct types (even if char has exactly the same range and semantics as one of the latter).
they're distinct in C too, actually; C++ just adds a 4th, a neutered version of unsigned char called std::byte
My vote is for form feed or vertical tab. Nobody uses that shit anymore.
*reserves thousands of codepoints for emoji and skin tone modifiers*
those aren't ASCII tho
>oh no no 1 ASCII codepoint is wasted on vertical tab!!!!111111111
>leaves 128-255 undefined
>another dunning kruger frogposter thread
you're just stupid. go play videogames and don't make more threads
>While it shows how clever the likes of K&R were, you should probably be looking at something more ... newbie-friendly.
You don't understand, CHARs are special in memory, they aren't bits but actual text. If you examine the banks under the microscope you can see the individual teeny tiny letters.
For the guy who insists characters are special... I hate to break it to you.
#include <stddef.h>

size_t this_function_returns_four(void)
{
    return sizeof('a'); /* character literals are int, so this is sizeof(int) */
}
Rare thread where the frogposter is right
Here, finally it's implied that byte = a CHAR_BIT unit, which must mean that char is a byte, and a byte doesn't have to be 8 bits.
Convoluted crap. And I like C.
CHAR_BIT's minimum required value is 8. It can be higher in theory; in practice only some obscure DSP platforms which don't support byte addressing set it higher. POSIX even requires that CHAR_BIT == 8.
>caring about types in C
even python respects types more than C
Stop writing code that tries to respect CHAR_BIT!=8.
already nobody writes such code, only microcotroller gays might be subject to this and they're on their own, not anyone's problem
https://github.com/search?q=CHAR_BIT&type=code
>shithub login-walling basic features now
were they getting DoS'd or is this M$ just trying to drive fake engagement for some weird ass adware bullshit?
I don't want to know.
reminder that C used to be untyped garbage (it's still garbage).
std::cin >> std::noskipws;
compile with gcc and ignore all autists screeching because you used c++