unsafe as frick too
Skill issue.
skill issue
Go back sub-parjeet
>unsafe as frick too
It is unsafe as frick though, but so is rust
>waaah! this box has sharp corners!!
Stay safe, bubble boy.
90% of CVEs are memory leaks
About half of your DNA is the same as a banana's.
fbpp
It's moronic to downplay unsafe when 90% of vulnerabilities come from it
90% of software comes from it.
100% of vulnerabilities come from you.
100% of memory leaks are you
Wrong, I don't use unsafe languages
Then 0% of useful code comes from you.
t. Unemployed
>employment is a sign of useful code
Employment means you write spyware of unscrupulous people. I reiterate.
You're a fricking moron
You never said I was wrong.
To wit:
If you are 1) employed and 2) are worried about security, it means you have access to a database of information about people you should not be collecting in the first place. Ergo, 0% of the code you write is useful and 100% of it is harmful to society.
>it means you have access to a database of information about people you should not be collecting in the first place.
Yes, you're a moron and also wrong
I am not wrong. The vast majority of corporations out there are spy organizations, gathering information about people they have no business collecting and selling it. The only reason they get mad is because an equally unscrupulous hacker uses the same information for their own private purposes. There is no difference whatsoever between these people. You are exploiting people and trying to protect your own racket from secondary exploitation.
PEBKAC
So are power tools in the hands of small children and the mentally challenged
Just like sex with OP's mom
C# is better
Not the same category of languages
perhaps, but C# has become incredibly fast (as long as you actually take advantage of those improvements), and it does so in a safe way (mostly)
fighting GC pauses is still a major issue
Shut up Michael, nobody cares about C shart.
I do
And yet I still see no contradiction.
kys javahomosexual
Thanks to compilers developers.
Any decent language compiler will use the same optimizations with some variance based on the language syntax and what the compiler can safely glean from it. Yet C solutions are generally faster at any optimization level including off.
>It's fast because it's so old.
What the frick would that have to do with it?
>C doesn't really lend itself to any particular obvious optimizations
It lends itself to multiple obvious and somewhat less obvious optimizations vs. most other languages.
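For one concrete example of the kind of optimization C's syntax can signal directly (a minimal sketch, not from any post above; the function name is made up): C99's `restrict` promises the compiler that two pointers don't alias, which is exactly the sort of hint that lets it keep values in registers and vectorize.

```c
#include <assert.h>
#include <stddef.h>

/* With restrict, the compiler may assume dst and src never overlap,
   so it can vectorize the loop instead of reloading src[i] after
   every store to dst[i]. */
static void saxpy(size_t n, float a,
                  float *restrict dst, const float *restrict src)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += a * src[i];
}
```

Without `restrict`, the compiler would have to assume `dst` and `src` might overlap and generate more conservative code.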
29% of CVEs in 2023 were related to overflow or memory corruption. I don't have a breakdown by language.
>noooo they're all C!
Not necessarily. "Memory safe" languages can have bugs which either directly or indirectly result in an exploitable, unsafe situation. Go look at the Rust CVEs. There are multiple examples of overflow and memory corruption. The CPU is not memory safe, which means any "safe" language has to try and enforce safety using code written and verified by human beings. And human beings make mistakes.
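To illustrate the overflow pattern behind many of those CVEs (a hypothetical sketch, not taken from any specific CVE): a bounds check that is off by one, next to the corrected version.

```c
#include <assert.h>
#include <string.h>

#define BUFSZ 8

/* BUG: `<=` allows len == BUFSZ, so the NUL terminator below is
   written one byte past the end of an 8-byte buffer. This is the
   classic shape of an out-of-bounds-write CVE. */
static int copy_bad(char *dst, const char *src, size_t len)
{
    if (len <= BUFSZ) {
        memcpy(dst, src, len);
        dst[len] = '\0';   /* writes dst[BUFSZ] when len == BUFSZ */
        return 0;
    }
    return -1;
}

/* Fixed: reject the too-long input instead of corrupting memory. */
static int copy_good(char *dst, const char *src, size_t len)
{
    if (len < BUFSZ) {
        memcpy(dst, src, len);
        dst[len] = '\0';
        return 0;
    }
    return -1;
}
```

The same logic bug can be written in a "memory safe" language too; there it typically becomes a panic or a denial of service rather than memory corruption.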
>It's moronic to downplay unsafe when 90% of vulnerabilities come from it
It's moronic to pull statistics out of your ass and present them as fact.
>Go look at the Rust CVEs.
Rust isn't safe
>hurr rust isn't safe
It is claimed to be as long as you're not using unsafe.
But let me clarify it for you: go look up the CVEs for ANY language (current and historical). You will find examples of buffer overflows and memory corruption. A surprising number, in fact.
>It is claimed to be as long as you're not using unsafe
You need to go unsafe to write any meaningful low-level code
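A sketch of the kind of thing "meaningful low-level code" has to do: treat a raw integer address as a device register. Here the register is simulated with an ordinary variable so the sketch actually runs; on real hardware the base would be a fixed physical address from a datasheet, and the UART name here is made up.

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in for a hardware register; in a real driver the address
   would come from the device's memory map, not from a variable. */
static uint32_t fake_uart_reg;

/* Conjure a pointer out of a bare integer and store through it.
   No type system can prove this safe, which is why Rust requires
   an `unsafe` block for the equivalent operation. */
static void uart_write(uintptr_t base, uint32_t byte)
{
    *(volatile uint32_t *)base = byte;
}
```

The `volatile` qualifier keeps the compiler from optimizing the store away, since device registers have side effects the language can't see.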
>I am not wrong
Yes you are, and also a schizo.
Why do you think I would accept the false judgments of a con artist who exploits people for a living?
>a con artist who exploits people for a living
Schizophrenic moron
You seem angry. Go get your shekels, but don't try to pretend you're a good person because you troon out with safe languages while ripping people off.
>while ripping people off
Whom have I ripped and how?
I already explained it. Go to your production database full of spy information that you collect about people and ask yourself whether it's worth a paycheck to betray people by collecting data on them like you're a mini-NSA unto yourself so you can use that information to your paymaster's advantage... then ask yourself whether you truly provide anything of value to the world.
Get a load of this moron lmao
You've already given yourself away by exposing your own motives. You are into these "safe" languages, you are concerned about buffer overflows primarily because the business you work for deals in information about a vast number of people it should not have. Simple as.
>because the business you work for deals in information about a vast number of people it should not have
Kek what a fricking idiot
You keep saying these things but you and I both know why you're so interested in this enterprisey java-esque/rustroon stuff.
You have a collection of information in your possession that is damaging to a large number of people but you don't want to give it up because it's useful to you and you want to exploit it to your advantage.
You are the fox guarding the henhouse.
Neither java nor rust. I deal mainly in .NET actually, while working for a Fintech, and I also have a few successful open-source projects.
>Fintech
And there you have it. I rest my case.
What about it?
>>It is claimed to be as long as you're not using unsafe
>You need to go unsafe to write any meaningful low-level code
I agree, but that just exposes the reason why C is needed.
But again: if you actually review the CVEs for a language, you will find buffer overflows and memory corruption in every one.
homie be spittin' facts, homes.
>What the frick would that have to do with it?
C wasn't "the fast language" back in the day, zoom zoom. It only became so because of compiler progress.
Writing fast code used to mean writing Fortran or straight ASM
C was always known for being fast. Do you really think companies were writing OSes in a slow language? I’m not only talking about the different flavors of UNIX. VMS, Windows, later versions of classic Mac OS (started in Pascal), BeOS (C and C++). I could go on.
>C was always known for being fast.
That's actually not true, when it first came out the compilers for it weren't very good.
Aren't OSes in C because the autists that made C rewrote Unix in it, and it's just stuck ever since? I love some C, but I wish they'd add in something like std::cin and std::string.
>Do you really think companies were writing OSes in a slow language?
The advantage of UNIX and C was portability and small size, i.e. the whole "worse is better" thing.
"C programmer" used to be an insult in Amiga video game devs communities the same way autists now look down on Python users.
It's fast because it's so old. C doesn't really lend itself to any particular obvious optimizations nor does it have enough standardized hints to reliably signal for them.
*block your path* not so fast.
and still no serious alternative
They just did shit better in the olden days. My three-decades-old washing machine is still going strong, while my neighbour's new Internet of Shit spinner went end-of-life within three years and broke down in five.
> still fast as frick
C is only fast because of the unreal resources that go into compilers and processors to let programmers pretend that, under the hood, the C abstract machine is actually how modern processors work. More than anything, C is holding back how fast processors can actually be. We're at the limit of how fast we can make C programs run.
>i'm going to parrot a dumb blog post
Anon I...
>C is only fast because of the unreal resources that go into compilers and processors
See:
>to let programmers pretend that, under the hood, the C abstract machine is actually how modern processors work
Though lower level than other popular languages today, C is a high level language that abstracts the underlying ISA. The features it exposes are common to all ISAs and are, in fact, how modern processors work.
>More than anything, C is holding back how fast processors can actually be
An utterly ridiculous claim which presumes two very false things.
- That the general features of CPU architecture were in any way defined by C. With the exception of RISC principles, they predate C. And RISC was not developed for or because of C.
- That C is fundamentally different from other common languages in ways which relate to the ISA. It is not.
>We're at the limit of how fast we can make C programs run
Another statement which presumes two false things.
- That CPU speed has stopped improving.
- That C is fundamentally different from other languages in a way which places a hard limit on CPU performance.
tl;dr - the blog post you're parroting is wrong on multiple levels anon. Spend some time studying computer architecture and then you'll realize how silly it is.
> Something something blog post
Blog post?
> See:
[...]
>It's fast because it's so old.
What the frick would that have to do with it?
>C doesn't really lend itself to any particular obvious optimizations
It lends itself to multiple obvious and somewhat less obvious optimizations vs. most other languages.
[...]
29% of CVEs in 2023 were related to overflow or memory corruption. I don't have a break down by language.
>noooo they're all C!
Not necessarily. "Memory safe" languages can have bugs which either directly or indirectly result in an exploitable, unsafe situation. Go look at the Rust CVEs. There are multiple examples of overflow and memory corruption. The CPU is not memory safe, which means any "safe" language has to try and enforce safety using code written and verified by human beings. And human beings make mistakes.
[...]
>It's moronic to downplay unsafe when 90% of vulnerabilities come from it
It's moronic to pull statistics out of your ass and present them as fact.
has nothing to do with anything.
> Though lower level than other popular ...
The underlying ISA is itself an abstraction, and a very different one from what the CPU is doing. The CPU does a great deal under the hood for performance reasons, but doesn't expose any of it, because the model the "low level" languages use can't take advantage of it; their own models don't express those features. Take cache, for example: something controlled completely under the hood, because C's abstract machine predates the era when memory lookups stopped being effectively instant. Too much of the world already ran on C by that point, and CPU cache was a performance gain, so we had to make it work. CPUs actually did work much like the C abstract machine back in the 70s. Modern CPUs only pretend to still work this way.
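A minimal illustration of the cache point: the two loops below are identical as far as C's abstract machine is concerned, same arithmetic, same result, yet on real hardware the sequential traversal is typically several times faster, because the strided one misses cache on nearly every access. (The array size here is arbitrary.)

```c
#include <assert.h>
#include <stddef.h>

#define N 256

static float grid[N][N];   /* row-major, as C arrays always are */

/* Walks memory sequentially: each cache line is used fully. */
static float sum_row_order(void)
{
    float s = 0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            s += grid[i][j];
    return s;
}

/* Strides N floats between accesses: touches a new cache line
   almost every iteration. Same abstract-machine semantics. */
static float sum_col_order(void)
{
    float s = 0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            s += grid[i][j];
    return s;
}
```

Nothing in the language distinguishes the two; the performance gap lives entirely in the hardware the abstract machine doesn't model.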
> An utterly ridiculous claim...
It doesn't presume either of those things. CPUs didn't evolve in a vacuum. They weren't just supposed to be fast; they were supposed to run *existing software* fast. The design of CPUs was absolutely influenced by the nature of the software running on them, regardless of who predates whom.
> Another statement which presumes...
Again, I presume neither. We can always pack on more cores and more cache, but single-thread throughput has stalled. And like caching, multithreading is another thing the C abstract machine had no conception of, which is why, for the longest time, C just considered it some weird OS-level thing where memory sharing was magic trickery.
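For reference, C11 did eventually bolt a memory model and atomics onto the language; before that, shared memory between threads really was an OS-level affair (POSIX threads and luck). A minimal sketch of what the standard now expresses directly:

```c
#include <assert.h>
#include <stdatomic.h>

/* A counter that is well-defined even when incremented from
   multiple threads concurrently: something plain `int` plus
   `counter++` never guaranteed. */
static _Atomic int counter = 0;

static void hit(void)
{
    atomic_fetch_add(&counter, 1);
}
```

The point stands either way: this arrived in 2011, roughly four decades after the language did.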
C once was a low level language. What changed was, CPUs got cool features that the C model didn't expose, so you could either: a) dig deeper and have the machine C targeted become itself a mid level language and implement said features under it, or b) tell everyone to rewrite all their C with a new C machine model. Guess which won.
>still fast as frick
Why would it get slower with age?
all things considered THE low level programming language
too bad about the duopoly of aids compilers