Why are redditors incapable of understanding big O notation?

Everything the redditor said here is wrong.


i made the post again!

It's not though? I mean, mathematically big-O is a function-limiting threshold, but it's not a horrible way to think about it intuitively. You don't teach calc 1 students about Darboux sums.

It's complexity growth, not just time. It could be memory or some other resource.

And what difference does that make?

>analysis

Most people take applied calc before they take analysis, at least where I am

>You don't teach calc 1 students about Darboux Sums

Darboux sums are elementary. Literally one of the most basic things in all of analysis, basically calculus 1.

i have a meth diploma and never herd about darboo sums.

It's just a name given to a standard definition of integration. There's no way you have a math diploma and have never heard of integrals.

I mean, technically it's a separate object used in defining integrals.

It's part of the definition of integrals.

In the same way a limit is part of the definition of derivative

Limits are used all the time outside of the context of a derivative. Darboux sums are not used outside the context of integrals.

Fair enough, I still wouldn't say they're just a part of the definition though

Where are they used outside of defining and computing integrals?

I don't know, probably not anywhere. This is pedantic and I'm gonna stop

>This is pedantic

I do not understand why you say that. Why do you call it pedantic?

Because it's inconsequential and semantic

How is the question of whether Darboux sums are used outside of integration semantic? Seems like a definite question with a definite answer.

> inconsequential

But you started the discussion by saying

> I mean, technically it's a separate object used in defining integrals.

That's still true. They're not the definition of an integral, they are an object defined for the purpose of defining an integral. Not the same

yes, ive herd integerls.

it had a name???

>Everything the redditor said here is wrong.

Please explain, I'm curious.

>O(1) is constant time

This is wrong.

>which means it doesn't take longer as the input size increases

This is wrong.

Do you get the idea or should I continue?

I'm brainlet plz continue 🙁

> O(logn) is logarithmic time, which means as the input size increases it takes a logarithmically small amount more time

This is wrong.

> O(n) is linear time, which means it takes a constant factor of time proportional to the input size

This is wrong.

> O(nlogn) is linear times logarithmic time, which is between linear and quadratic

This is wrong

>um... it is... le wrong!

Ok, why is it wrong? Surely you're able to substantiate your claims.

He cannot, he's just baiting. The problem is that in every single one of these threads posted over the last few days, it's a constant:

> The redditor is wrong

> How

> It's just wrong okay?

No real debate happening, nothing said that anyone could learn from; it's just a dead thread where OP comes here and makes a fool of himself while everyone throws popcorn.

You must understand, I am braindead and those threads are the only way I get attention except for my profile on grindr.

>troony makes everything worse by his mere presence

Well at least more and more people hate trannies every single day

For example, f(n) = 1 - 1/n is O(1) but is not constant.
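A minimal check of this example, as a sketch (the helper name `f` is mine, not from the thread):

```python
# f(n) = 1 - 1/n is bounded above by the constant 1 (so f is O(1)),
# yet strictly increasing, so it is not a constant function.
def f(n):
    return 1 - 1 / n

values = [f(n) for n in range(1, 1000)]
assert all(v < 1 for v in values)                      # bounded: witness C = 1
assert all(a < b for a, b in zip(values, values[1:]))  # strictly increasing
```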

so this is the power of IQfy...

?

Case in point.

Anyone with a tiny bit of CS experience can show you that it is constant by virtue of N being literally irrelevant. Since N is an integer, even for N=1 the 1/n part becomes irrelevant. Never mind when N approaches infinity (i.e. a number that is big enough), which will be the majority of cases.

Pic related function is O(1).

>troonymos

YWNBAW

I already AM a woman but ok.

what's the chud approved web based grapher? i'm always looking for new sites to try.

By reducing it to (x^2 sin(x^2))/x^2 (which you can do since we do not care about constants; we are computing functions for a really large X value) you can see that it essentially simplifies to sin(x^2), which always lies in the [0,1] interval (I'm deliberately cutting out the negatives here because they can't exist), so f(n) = O(1).
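A quick numeric sanity check of that simplification, as a sketch (note: big O bounds |f|, and sin(x^2) does dip negative, which doesn't change the O(1) conclusion):

```python
import math

# f(x) = (x^2 * sin(x^2)) / x^2 simplifies to sin(x^2) for x != 0,
# and |sin| never exceeds 1, so |f(x)| <= 1 everywhere: f is O(1).
def f(x):
    return (x**2 * math.sin(x**2)) / x**2

samples = [f(0.1 * k) for k in range(1, 10000)]
assert all(abs(v) <= 1 + 1e-12 for v in samples)  # bounded (up to float rounding)
assert any(v < 0 for v in samples)                # the negatives do occur
```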

and that function runs in constant time. What's the problem here?

>constant time, which means it doesn't take longer as the input size increases

in the limit

OK, that's what I thought. You're nit picking. But yes, you're right.

Nobody said it was constant.

Constant time complexity class has never implied a constant function.

It's still called constant time though in case you didn't know.

>Nobody said it was constant.

Redditor in the OP did.

> Constant time complexity class has never implied a constant function.

Redditor explained that for him constant time means constant function.

>Redditor in the OP did.

Where?

>Redditor explained that for him constant time means constant function.

Where?

>Where?

>which means it doesn't take longer as the input size increases

Read homie

Clearly it is you who doesn't get big O notation. Big O describes the asymptotic growth of a function. Now remind me, what is the limit of 1/n as n goes to infinity again?

The limit of 1/n as n goes to infinity is 0. What's your point?

Okay, I'll lead you on a bit more.

Now we know the limit. What function bounds the growth of lim 1 + 1/n?

Not 1

The growth of 1? 1 doesn't grow, it's a constant. Any function f(x) which satisfies f(x)>=1 bounds 1 from above.

>Any function f(x) which satisfies f(x)>=1 bounds 1 from above.

No, it has to be strictly greater than

>No, it has to be strictly greater than

A bound does not have to be strictly greater. Take the bound for [0,1]: this is clearly 1, as any value in [0,1] is either smaller than 1 or equal to it.

>That's wrong though, 1 doesn't bound the function

Clearly meant here that 1 bounds lim n -> infty f(n) = 1.

>Just what do you think O(1) means?

It means that f(n) is bounded by a functions in the set of O(1) (constant function) as n goes to infinity. I.e:

lim n -> infty |f(n)| <= f(x) (let's say f(x) = 1).

>A bound does not have to be strictly greater

It doesn't, but the function never reaches 1, it's always greater than it. 1 bounds the limit but not the function. 1.000001 bounds the function after a certain point though.

>It means that f(n) is bounded by a functions in the set of O(1) (constant function) as n goes to infinity. I.e:

>lim n -> infty |f(n)| <= f(x) (let's say f(x) = 1).

This is wrong. Plenty of O(1) functions do not have limits as x -> infinity.

>This is wrong. Plenty of O(1) functions do not have limits as x -> infinity.

Fair enough, I left that out, but indeed for this reason big O is defined as being bounded by g(x) only after some x_0.
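Written out, the textbook definition being referred to (standard formulation, not a quote from the thread):

```latex
f \in O(g) \;\iff\; \exists\, C > 0,\ \exists\, x_0 \ \text{such that}\ |f(x)| \le C \cdot g(x) \quad \text{for all } x \ge x_0
```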

>It doesn't, but the function never reaches 1, it's always greater than it. 1 bounds the limit but not the function. 1.000001 bounds the function after a certain point though.

We were discussing the limit of the example f(x), which is 1 and therefore bounded by 1.

>Also your definition is circular.

It is not, as 1 is by definition in O(1). After this we can construct the whole set from this one element.

>You are also illiterate. Good job.

please elaborate.

>which is 1 and therefore bounded by 1.

The limit of a function being 1 doesn't imply that the function itself is bounded by 1

it does in big-O notation, except it can be any fixed time, it doesn't have to be 1

>It is not, as 1 is by definition in O(1). After this we can construct the hole set from this one element.

It looks like f(_)->2 is never in your circular definition of O(1).

Also your definition is circular.

>O(1) means that f(n) is bounded by a function in the set of O(1).

You are also illiterate. Good job.

They use Theta for tight bounds.

You're getting it!

Indeed, every function >= 1 bounds it! It is common practice to say that the asymptotic growth of 1 + 1/n is O(1), but it is in fact also bounded by the functions in O(n), etc.

Just what do you think O(1) means?

What does that mean?

in the limit, the computation does not take any longer

I asked you what you mean by "in the limit", and you just repeated in the limit. What does that mean?

ah, sorry.

as the input n increases, the time complexity ceases to grow

But it doesn't for 1 - 1/n. What's an n after which you think it ceases to grow?

8 on its side

Not a member of the domain of the function.

Does 2 + sin(n) cease to grow as n increases?

asymptotically yes

Wrong. Try to prove it (you won't be able to)

I indeed am incapable of formally proving that max(sin(x)) = 1, but I don't really care about that. It's quite obvious.

If I'd hazard a geometric guess, it would be something about the two sides of the triangle coalescing into 1, hence the ratio becoming 1.

anyway, how does formally proving something that is obvious matter to whatever you're on about?

I didn't ask you to formally prove max(sin(x)) which is already obvious for real x. I asked you to prove that sin(x) ceases to grow asymptotically as x increases, a demonstrably false claim which you made.

>I asked you to prove that sin(x) ceases to grow asymptotically as x increases

so you concede that max(sin(x)) = 1? there's your answer then

to add to that:

are you perhaps confusing convergence and asymptotic growth?

>Does 2 + sin(n) cease to grow as n increases?

That's clearly bounded above by 3 you dumbass.
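A quick numeric check of the 2 + sin(n) example (a sketch; `g` is my name for it): it never converges, yet stays in [1, 3], so it is O(1) with witness C = 3.

```python
import math

# g(n) = 2 + sin(n) oscillates forever and has no limit,
# but 1 <= g(n) <= 3 for all n -- boundedness, not convergence,
# is what membership in O(1) requires.
def g(n):
    return 2 + math.sin(n)

samples = [g(n) for n in range(1, 100000)]
assert all(1 <= v <= 3 for v in samples)
```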

> Indeed every function >= 1 bounds it!

That's wrong though, 1 doesn't bound the function

>O(1) is when functions don't have variables

Who are you quoting?

How does max(sin(x)) = 1 imply that sin(x) ceases to grow asymptotically? You just restated your claim, which I denied. Learn to read.

No.

>How does max(sin(x)) = 1 imply that sin(x) ceases to grow asymptotically?

??? because |sin(x)| is thus obviously bounded by 1. this is so obvious that I have to assume you're either a troll or a bot, so I'll stop conversing with you now. good night :)

How is it not constant?

Can someone explain to a peabrain? The computer will load the same amount of data and perform the same instructions each time regardless of how big N is. Or are we assuming N can have more than 1 item?

>Can someone explain to a peabrain?

Change f(n) a bit to see it.

For example: f(n) = 10000 - 10000 / n

```javascript
function what(arr) {
  for (var i = 10000; i > (10000 / arr.length); i--) {
    console.log(i);
  }
}

// arr of size 1:     0 ops
// arr of size 5000:  9998 ops
// arr of size 10000: 9999 ops
// arr of size 20000: 10000 ops
// arr of size 30000: 10000 ops
// arr of size 40000: 10000 ops
```

The op count is not constant; it depends on the size of arr.

But you can write f(n) <= k * g(n) for all n, taking k = 10000 and g(n) = 1.

Thus, by definition, O(1).
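The loop above, re-counted in Python (a sketch; `ops` is a hypothetical helper, not from the thread) to make the witness k = 10000, g(n) = 1 concrete:

```python
# Iterations of `for (var i = 10000; i > 10000 / n; i--)`:
# grows with n at first, then saturates at 10000.
def ops(n):
    count = 0
    i = 10000
    while i > 10000 / n:
        count += 1
        i -= 1
    return count

assert ops(1) == 0           # 10000 > 10000/1 is false: zero iterations
assert ops(20000) == 10000   # saturated: 10000/n < 1, so i runs 10000..1
assert ops(40000) == 10000   # never exceeds k = 10000, hence O(1)
```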

elaborate

He's wrong; it is true. O(1) means that there exists a constant such that, for any n greater than a certain point, the function is less than the constant.

That's exactly what it means. Explain to me how OOP is wrong without just saying it is.

>tfw incrementing an integer in binary is O(n!)

It's also O(n^n)

it's truly amazing that computers can even do math at all with that kind of bound

You lost me there.

Refer to

Still wrong, same example applies.

No, constant factorial time is right. It uses 0 space.

1! = 1

The science has been settled already.

>It uses 0 space.

wrong, you have to load the code on memory

There is no code. There is no memory. It’s negative space. Black hole(1)

blackholes aren't real

its called big O because thats what she gets when I'm done with her black hole

big O is simply an upper bound, so almost any operation is O(n!n^n^n) or whatever
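The increment joke above, sketched: a ripple-carry increment on an n-bit counter (bits stored least significant first) is O(n) in the worst case, which makes O(n!) or O(n^n) technically valid upper bounds, just absurdly loose ones.

```python
# Increment a binary counter stored as a list of bits, LSB first.
# Worst case (all ones in the low bits) touches every bit: O(n).
def increment(bits):
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0   # carry ripples through trailing 1s
        i += 1
    if i < len(bits):
        bits[i] = 1   # first 0 absorbs the carry
    return bits

assert increment([1, 1, 1, 0]) == [0, 0, 0, 1]  # 7 + 1 = 8: carry ripples far
assert increment([0, 1]) == [1, 1]              # 2 + 1 = 3: one bit touched
```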

it's not though because processors have instructions like AND, OR, XOR. but also they have ICs specifically for math.

https://www.baeldung.com/cs/arithmetic-logic-unit

>he doesn't know

Why are IQfyners incapable of coming up with a topic of discussion that doesn't involve reposting from social media?

In my opinion as some random CS student, that sounds about right

Maybe there's an extremely autistic way to say that he's wrong but for applied math (CS), that's kind of what it comes down to

Clearly you have no idea what big O notation denotes.

the definition isn't particularly long.

although, as usual, mathematical notation makes it somewhat cryptic

Yet somehow nobody except me on IQfy (and possibly you) understands the definition. Just read this thread. People are completely baffled as to what big O means.

what part do you think people don't understand?

I prefer the definition

f in O(g) <=> limsup_{n->infty} f(n)/g(n) < infty

O(n^2) is equal to O(1). Why is this so hard for people to understand?

It's not. n^2 is in O(n^2) but not in O(1).

You’re right. I meant O(1!). Fixed.

so the whole point of the thread is that the reddit guy didn't say "the worst case is bounded by a constant independently of the input" and instead he said "the function is constant time"

I'm not gonna fall for this bait again.

It's good bait though.

quantum computers will achieve true constant time

What a ridiculous thing to say

he means that time will constantly be wasted on getting quantum computers to do anything useful, and AGI wont even bother because it can more easily manipulate human brains to function as "cloud quantum computers"

A computer science professor and transsexual was teaching a class on the Lambda Calculus.

"Before the class begins, you must get on your knees and worship Alan Turing and accept that he was the most highly-rigorous Computer Scientist the world has ever known!"

At this moment, a brave, intuitionist, wildbergian computer scientist who had produced 1500 actual programs and understood the necessity of algorithmic thinking stood up and held up a reel of tape memory.

"How much memory does this Turing Machine have?"

The arrogant professor smirked quite Infinitistly and smugly replied "a Turing Machine requires infinite memory!"

"Wrong. There is only a finite amount of memory in this reel of tape."

The professor was visibly shaken, and dropped his japanese chalk and copy of SICP. He stormed out of the room crying those infinist crocodile tears. There is no doubt that at this point our professor, Cardinal troonystein, wished he had pulled himself up by his bootstraps and actually programmed instead of being a sophist infinity schizo. He wished so much that he had a gun to shoot himself from embarrassment, but the bullet would take infinite steps to reach his head!

The students applauded and all studied John Backus that day and accepted Dennis Ritchie as their lord and savior. An eagle named “Imperative Programming” flew into the room and perched atop the SICP book and shed a tear on the chalk. Wildberger's videos were watched several times, and God himself showed up after descending a finite amount from heaven.

The professor lost his tenure and was fired the next day. He died of gay plague AIDS and was expelled from the paradise Wildberger had created for all eternity.

I don't care. All I know is O(1) good, O(n) bad.

not always; if you know for sure that you won't have a big input, O(n) might be faster/better than O(1). you always have to test before making a judgment, which in turn makes big O notation somewhat useless

Yeah but if the difference between the two is not big enough I also don't care.

>All I know is O(1) good, O(n) bad.

Depends on what you're doing. For searching, O(n) isn't good. For sorting, it's superb.

the thing most people get wrong is O(1) vs O(log n)

if your "direct" formula has e.g. something raised to the power of n, you can compute it with O(log n) multiplications

https://cp-algorithms.com/algebra/binary-exp.html

print(pow(a, n))  # O(log n)
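A sketch of the square-and-multiply idea the linked cp-algorithms page describes (Python's built-in pow uses the same trick internally for integer exponents):

```python
# Binary exponentiation: O(log n) multiplications instead of n - 1.
def binpow(a, n):
    result = 1
    while n > 0:
        if n & 1:        # low bit of the exponent set: fold in the base
            result *= a
        a *= a           # square the base
        n >>= 1          # move to the next exponent bit
    return result

assert binpow(2, 10) == 1024
assert binpow(3, 0) == 1
assert binpow(5, 13) == 5 ** 13
```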