
>int getchar( void );
>int
>getCHAR

Fricking toy language.


  1. 2 weeks ago
    Anonymous

    redditfrog please, how is getchar supposed to return a value that's not a char when it needs to signal an EOF? Enlighten us.

    • 2 weeks ago
      Anonymous

      Should've made EOF a char??

      • 2 weeks ago
        Anonymous

        which char is supposed to become the invalid value when all values of char are valid characters?

        • 2 weeks ago
          Anonymous

          Why the frick should I know? Ask boomers who came up with this moronic design. They could've used 0 for the EOF and 1 for null terminator (which is a moronic design by itself)

          • 2 weeks ago
            Anonymous

            >0 for EOF
            0 is a valid character and therefore cannot be used to signal end of file

          • 2 weeks ago
            Anonymous

            It's only valid because moronic boomers said so.

          • 2 weeks ago
            Anonymous

            it is always valid

      • 2 weeks ago
        Anonymous

        This zoomer frog with the puffy hair is so funny :^)

    • 2 weeks ago
      Anonymous

      Wouldn't you just return EOT, which is 0x04?

    • 2 weeks ago
      Anonymous

      Why the frick are you autists signaling inband? Why is a "getchar" even concerned with an EOF?

      • 2 weeks ago
        Anonymous

        I do not know nor endorse use of getchar, it's your fault for wanting to do it.

    • 2 weeks ago
      Anonymous

      Make another function called char eof(FILE*) that checks for EOF. Just call it before calling getchar.
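
      C's real feof(FILE*) has almost exactly that shape, but the check-before-read scheme doesn't work: feof only reports EOF after a read has already failed. A minimal sketch of the pitfall (hypothetical counter function, standard C):

```c
#include <stdio.h>

/* feof() only becomes true *after* a read hits end of file, so the
   check-first pattern always runs one extra iteration in which fgetc()
   returns EOF instead of data. */
long iterations_check_first(FILE *f)
{
    long n = 0;
    while (!feof(f)) {
        fgetc(f);   /* the final call reads EOF, not a byte */
        n++;
    }
    return n;
}
```

      For a 2-byte stream this loops 3 times, which is exactly the off-by-one the in-band EOF return of getchar avoids.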

      • 2 weeks ago
        Anonymous

        how about just read(2) and write(2) more than one byte? since you get back the number of bytes read into your buffer, you already know you ran out of bytes to read when read returns 0.
        fread wraps these syscalls and it just works.

        • 2 weeks ago
          Anonymous

          Because that's antisemitism you fricking hamas chud.

  2. 2 weeks ago
    Anonymous

    What's the problem?

  3. 2 weeks ago
    Anonymous

    There are 256 possible values that can be stored in a char.
    getchar needs to be able to return 257 possible values -- all values that fit in a char, plus EOF.
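
    That 257th value is the whole reason for the int return type. A minimal sketch (hypothetical helper name, standard C):

```c
#include <stdio.h>

/* Copy one stream to another a character at a time; returns bytes copied.
   c must be an int: fgetc() yields 0..255 for real bytes and the negative
   EOF as the 257th, out-of-range value. */
long copy_stream(FILE *in, FILE *out)
{
    long n = 0;
    int c;
    while ((c = fgetc(in)) != EOF) {
        fputc(c, out);
        n++;
    }
    return n;
}
```

    Declaring c as char breaks this: an unsigned char never compares equal to EOF (infinite loop), and a signed char makes the byte 0xFF end the loop early.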

    • 2 weeks ago
      Anonymous

      > BEL (0x07 = \a), which causes a terminal to beep and/or flash.

      Wonderful use of valuable space. I'm sure there are plenty of useless boomer idiotic characters that could've been used as EOF.

      • 2 weeks ago
        Anonymous

        Suppose I am writing a parser for an image format. I have piped the image into stdin for my parser to read. The image can contain values in the range 0-255 for each color channel of each pixel. Here, a char is not representing text but binary data. How are we to represent that there are no more pixels left to read?
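
        A sketch of the failure mode for exactly this scenario (hypothetical function; assumes a two's-complement signed char, which virtually every platform uses):

```c
#include <stdio.h>

/* Deliberately wrong: storing fgetc()'s result in a signed char makes
   the payload byte 0xFF collapse to -1, which compares equal to EOF,
   so the loop stops in the middle of valid binary data. */
long broken_byte_count(FILE *f)
{
    long n = 0;
    signed char c;                      /* should be int */
    while ((c = (signed char)fgetc(f)) != EOF)
        n++;
    return n;
}
```

        Fed the three bytes 0x41 0xFF 0x42, this counts only one byte before the 0xFF masquerades as EOF.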

        • 2 weeks ago
          Anonymous

          > Here, a char is not representing text but binary data.

          ???????? CHAR should represent a CHARacter????? Why the frick are cniles so moronic?

          • 2 weeks ago
            Anonymous

            And chars are represented by numbers.

          • 2 weeks ago
            Anonymous

            > chars are represented by numbers.

            Cniles read this shit and see nothing wrong.
            Why would you have special 'CHAR' type then???

          • 2 weeks ago
            Anonymous

            Because it tells the compiler what to expect in that type so it can optimize the assembly it generates.

          • 2 weeks ago
            Anonymous

            This zoomer does not understand that all data is bytes. He thinks the computer magically writes letters into memory without using numbers

          • 2 weeks ago
            Anonymous

            It's not special. It's a base integer data type. There are 5 base integer data types, plus the unsigned versions:

            char (minimum 8 bits)
            short (minimum 16 bits)
            int (minimum 16 bits)
            long (minimum 32 bits)
            long long (minimum 64 bits)

            Each platform is free to define the size of these data types provided that:
            sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long).

          • 2 weeks ago
            Anonymous

            The char data type is used for bytes. A character is just one type of byte. But there is no reason why char necessarily represents an ASCII-encoded character, or even any form of text data.

          • 2 weeks ago
            Anonymous

            > But there is no reason why char necessarily represents a char

            Cniles read this and see nothing wrong.

            Why the frick would you have a type for a char then????????

          • 2 weeks ago
            Anonymous

            We have a type for bytes. We call it char, because the most common use for a single byte is to store a character. But it can store any one byte value.

            By the way, the type of a character literal is int, not char.

          • 2 weeks ago
            Anonymous

            >we're using CHARACTER type for representing the bytes!
            >why you couldn't name the type 'byte' and use the different type for characters?
            >w-w-we just couldn't, okay??? SHUT UP!!
            Toy language.

            >type of a character literal is int, not char.
            Fricking moronic toy language.

          • 2 weeks ago
            Anonymous

            Here's something for ya: many people think byte means 8 bits. It doesn't -- that's what an octet is. A byte is the smallest addressable unit of memory on a machine... which is usually 8 bits, but some machines had word sizes of 18 or 36 bits and used 9-bit bytes. A char can be 9 bits on those platforms. Now imagine the confused morons screeching that the byte data type was sometimes 9 bits. Imagine having to explain to someone why we have a BYTE_BITS constant.

          • 2 weeks ago
            Anonymous

            You are learning that conventions are done for our own sake to abstract away the fact that computers are just binary machines.
            Now you just have to get over it.

          • 2 weeks ago
            Anonymous

            the type for bytes is uint8_t, char's size is implementation dependent and shouldn't be used to represent exactly a byte of data.

          • 2 weeks ago
            Anonymous

            You can make that logical distinction, but uint8_t is always a typedef for unsigned char, because char is at least 8 bits and char is the smallest C type.

            > We have a type for bytes. We call it char, because the most common use for a single byte is to store a character. But it can store any one byte value.

            > By the way, the type of a character literal is int, not char.
            Is there a difference between the "type" of the literal and the effect of integer promotion?

        • 2 weeks ago
          Anonymous

          You definitely shouldn't use a char for that Black person, not only is the size of char left to implementation, it's also not its fricking purpose. You use int_ for that, in your case 16 is enough.

          • 2 weeks ago
            Anonymous

            sizeof(char) == 1 always
            But yes, for byte of data you should use unsigned char.

          • 2 weeks ago
            Anonymous

            See

            > It's not special. It's a base integer data type. There are 5 base integer data types, plus the unsigned versions:
            > char (minimum 8 bits)
            > short (minimum 16 bits)
            > int (minimum 16 bits)
            > long (minimum 32 bits)
            > long long (minimum 64 bits)
            > Each platform is free to define the size of these data types provided that:
            > sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long).

            That is absolutely not true, C never enforces a specific size for its primitives, hence why uint_ and int_ exist. I've already found issues with int not being the same fricking size in two different stdlibs.

          • 2 weeks ago
            Anonymous

            "sizeof(char) == 1" is always true, moron. Learn some fricking C. sizeof returns the size in units of char, not bytes.

          • 2 weeks ago
            Anonymous

            Black person read the actual ISO definition of C, it is not guaranteed to be a byte or "1" unit; that just happens to be its most common size, but it is by no means guaranteed

          • 2 weeks ago
            Anonymous

            > "sizeof(char) == 1" is always true, moron. Learn some fricking C. sizeof returns the size in units of char, not bytes.

            Standard literally says so btw.

            https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf

          • 2 weeks ago
            Anonymous

            What then about 2 byte char platforms? I can't see it guaranteeing to give a 1 and also letting the size of char up to implementation, something else must be missing.

          • 2 weeks ago
            Anonymous

            >2 byte
            you mean CHAR_BIT == 16 and sizeof(char) == 1?
            Byte is a nonstandard thing, just because your babyduck syndrome makes you think that byte means CHAR_BIT = 8, doesn't mean it's true, there used to be machines with moronic sizes like 13 bits per byte and it worked just fine.

          • 2 weeks ago
            Anonymous

            Ah I see my mistake, so if you need 8bit you should use uint, but if you need a "byte" you can use the primitives. Still for reading a file you should use uint since most parsers assume 8bit byte

          • 2 weeks ago
            Anonymous

            on such a platform uint8 won't ever be used, but normally if you use such a platform you already know this and can optimize for the case that each byte is 16 bits

          • 2 weeks ago
            Anonymous

            Anyways why do we need int getchar again, since char already goes from -127 to 127 and the ASCII table only has 128 characters?

          • 2 weeks ago
            Anonymous

            I don't know which tard came up with the idea of reading one char at a time when it's never a good solution. A good solution is allocating a big buffer and reading into it, and an even better one is simply memory mapping the file so it's already transparently in memory; maybe it's slower at the start but you can use SIMD on incoming data without worrying about copying shit anywhere so it's good.
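
            The big-buffer version is a few lines with plain fread (hypothetical helper name; mmap is POSIX and platform-specific, so only the portable variant is sketched):

```c
#include <stdio.h>

/* Read a stream in 4 KiB chunks; returns the total number of bytes.
   fread returning 0 is the out-of-band end signal, so no byte value
   needs to be reserved for EOF. */
long slurp_length(FILE *f)
{
    char buf[4096];
    long total = 0;
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        total += (long)n;
    return total;
}
```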

          • 2 weeks ago
            Anonymous

            I don't really like how the standard defines things.

          • 2 weeks ago
            Anonymous

            > Here, finally it's implied that byte = a CHAR_BIT unit, which must mean that char is a byte, and a byte doesn't have to be 8 bits.
            > Convoluted crap. And I like C.

            >byte is implementation defined collection of bits
            how is this convoluted?

          • 2 weeks ago
            Anonymous

            Why can't they just say that a char is a byte?
            Why can't they say 7-bit ASCII instead of "basic execution character set"?
            The C spec is way too abstract and roundabout.

          • 2 weeks ago
            Anonymous

            And of course, the answer is IBM
            https://en.wikipedia.org/wiki/EBCDIC

          • 2 weeks ago
            Anonymous

            Well it's a lawyer tier language, always has been this way, and I don't make the rules.
            Also everyone knows that ASCII is 7 bit, only a heretic would think otherwise, so no one needs to talk about it.
            The most cringe thing about C is that it's implementation defined whether char is unsigned or signed, and while that's annoying, it's also proof that ASCII is 7 bit, because the 8th bit is the sign.

          • 2 weeks ago
            Anonymous

            >The most cringe thing about C is that char is actually implementation defined whether it's unsigned or signed,
            What's even more cringe is that "char", "signed char", and "unsigned char" are all distinct types (even if char has exactly the same range and semantics as one of the latter).

          • 2 weeks ago
            Anonymous

            in C they aren't, in C++ we now have 4 of them, the 4th being a neutered version of unsigned char called std::byte

      • 2 weeks ago
        Anonymous

        My vote is for form feed or vertical tab. Nobody uses that shit anymore.

      • 2 weeks ago
        Anonymous

        > My vote is for form feed or vertical tab. Nobody uses that shit anymore.

        *reserves thousands of codepoints for emoji and skin tone modifiers*

        • 2 weeks ago
          Anonymous

          those aren't ASCII tho

          • 2 weeks ago
            Anonymous

            >oh no no 1 ASCII codepoint is wasted on vertical tab!!!!111111111
            >leaves 128-255 undefined

  4. 2 weeks ago
    Anonymous

    >another dunning kruger frogposter thread
    you're just stupid. go play videogames and don't make more threads

  5. 2 weeks ago
    Anonymous

    >While it shows how clever the likes of K&R were, you should probably be looking at something more ... newbie-friendly.

  6. 2 weeks ago
    Anonymous

    You don't understand, CHARs are special in memory, they aren't bits but actual text. If you examine the banks under the microscope you can see the individual teeny tiny letters.

  7. 2 weeks ago
    Anonymous

    For the guy who insists characters are special... I hate to break it to you.
    #include <stddef.h>

    size_t this_function_returns_four(void)
    {
        return sizeof('a');
    }

  8. 2 weeks ago
    Anonymous
  9. 2 weeks ago
    Anonymous

    Rare thread where the frogposter is right

  10. 2 weeks ago
    Anonymous

    Here, finally it's implied that byte = a CHAR_BIT unit, which must mean that char is a byte, and a byte doesn't have to be 8 bits.
    Convoluted crap. And I like C.

  11. 2 weeks ago
    Anonymous
  12. 2 weeks ago
    Anonymous

    CHAR_BIT's minimum required value is 8. It can be higher in theory; in practice only some obscure DSP platforms that don't support byte addressing set it higher. I think POSIX even requires that CHAR_BIT == 8.

  13. 2 weeks ago
    Anonymous

    >caring about types in C
    even python respects types more than C

  14. 2 weeks ago
    Anonymous

    Stop writing code that tries to respect CHAR_BIT!=8.

    • 2 weeks ago
      Anonymous
    • 2 weeks ago
      Anonymous

      already nobody writes such code, only microcontroller gays might be subject to this and they're on their own, not anyone's problem

      • 2 weeks ago
        Anonymous

        https://github.com/search?q=CHAR_BIT&type=code

        • 2 weeks ago
          Anonymous
          • 2 weeks ago
            Anonymous

            >shithub login-walling basic features now
            were they getting DoS'd or is this M$ just trying to drive fake engagement for some weird ass adware bullshit?

          • 2 weeks ago
            Anonymous

            I don't want to know.

  15. 2 weeks ago
    Anonymous

    reminder that C used to be untyped garbage (it's still garbage).

  16. 2 weeks ago
    Anonymous

    std::cin >> std::noskipws;
    compile with gcc and ignore all autists screeching because you used c++
