Why are redditors incapable of understanding big O notation? Everything the redditor said here is wrong.

  1. 6 days ago
    Anonymous

    i made the post again!

  2. 6 days ago
    Anonymous

    It's not though? I mean, mathematically big-O is a limiting threshold on a function, but it's not a horrible way to think about it intuitively. You don't teach calc 1 students about Darboux Sums.

    • 6 days ago
      Anonymous

      It's complexity growth not just time. It could be memory or some other resource.
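
      For instance (an illustrative Python sketch, not from the thread; both functions and the data are made up): the two routines below do the same O(n) amount of work in time, but one uses O(1) extra space and the other O(n).

      def total(xs):            # O(n) time, O(1) extra space
          acc = 0
          for x in xs:
              acc += x
          return acc

      def prefix_totals(xs):    # O(n) time, O(n) extra space for `out`
          acc, out = 0, []
          for x in xs:
              acc += x
              out.append(acc)
          return out

      print(total([1, 2, 3]))          # 6
      print(prefix_totals([1, 2, 3]))  # [1, 3, 6]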

      • 6 days ago
        Anonymous

        And what difference does that make?

        >You don't teach calc 1 students about Darboux Sums
        >Darboux sums are elementary. Literally one of the most basic things in all of analysis, basically calculus 1.
        >analysis
        Most people take applied calc before they take analysis, at least where I am

    • 6 days ago
      Anonymous

      >You don't teach calc 1 students about Darboux Sums
      Darboux sums are elementary. Literally one of the most basic things in all of analysis, basically calculus 1.

      • 6 days ago
        Anonymous

        i have a meth diploma and never herd about darboo sums.

        • 6 days ago
          Anonymous

          It's just a name given to a standard definition of integration. There's no way you have a math diploma and have never heard of integrals.

          • 6 days ago
            Anonymous

            I mean, technically it's a separate object used in defining integrals.

          • 6 days ago
            Anonymous

            It's part of the definition of integrals.

          • 6 days ago
            Anonymous

            In the same way a limit is part of the definition of derivative

          • 6 days ago
            Anonymous

            Limits are used all the time outside of the context of a derivative. Darboux sums are not outside of the context of integrals.

          • 6 days ago
            Anonymous

            Fair enough, I still wouldn't say they're just a part of the definition though

          • 6 days ago
            Anonymous

            Where are they used outside of defining and computing integrals?

          • 6 days ago
            Anonymous

            I don't know, probably not anywhere. This is pedantic and I'm gonna stop

          • 6 days ago
            Anonymous

            >This is pedantic
            I do not understand why you say that. Why do you call it pedantic?

          • 6 days ago
            Anonymous

            Because it's inconsequential and semantic

          • 6 days ago
            Anonymous

            How is the question of whether Darboux integrals are used outside of integration semantic? Seems like a definite question with a definite answer.
            > inconsequential
            But you started the discussion by saying
            > I mean, technically it's a separate object used in defining integrals.

          • 6 days ago
            Anonymous

            That's still true. They're not the definition of an integral, they are an object defined for the purpose of defining an integral. Not the same

          • 6 days ago
            Anonymous

            yes, ive herd integerls.

          • 6 days ago
            Anonymous

            it had a name???

  3. 6 days ago
    Anonymous

    >Everything the redditor said here is wrong.
    Please explain, I'm curious.

    • 6 days ago
      Anonymous

      >O(1) is constant time
      This is wrong.
      >which means it doesn't take longer as the input size increases
      This is wrong.
      Do you get the idea or should I continue?

      • 6 days ago
        Anonymous

        I'm brainlet plz continue 🙁

        • 6 days ago
          Anonymous

          > O(logn) is logarithmic time, which means as the input size increases it takes a logarithmically small amount more time
          This is wrong.
          > O(n) is linear time, which means it takes a constant factor of time proportional to the input size
          This is wrong.
          > O(nlogn) is linear times logarithmic time, which is between linear and quadratic
          This is wrong.

      • 6 days ago
        Anonymous

        >um... it is... le wrong!
        Ok, why is it wrong? Surely you're able to substantiate your claims.

        • 6 days ago
          Anonymous

          He cannot, he's just baiting. The problem is that in every single one of these threads posted over the last few days, it's a constant
          > The redditor is wrong
          > How
          > It's just wrong okay?
          No real debate happens, nothing is said that anyone could learn from; it's just a dead thread where OP comes here and makes a fool of himself while everyone throws popcorn.

          • 6 days ago
            Anonymous

            You must understand, I am braindead and those threads are the only way I get attention except for my profile on grindr.

          • 6 days ago
            Anonymous

            >troony makes everything worse by his mere presence
            Well at least more and more people hate trannies every single day

        • 6 days ago
          Anonymous

          For example, f(n) = 1 - 1/n is O(1) but is not constant.
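
          A quick numeric check of this example (a minimal Python sketch; the sample points are arbitrary): f keeps increasing at every step, so it is not constant, yet it never exceeds 1, which is all O(1) asks for.

          def f(n):
              return 1 - 1 / n   # strictly increasing in n, but always < 1

          for n in [1, 10, 100, 10**6]:
              print(n, f(n))     # 0.0, 0.9, 0.99, 0.999999: growing but bounded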

          • 6 days ago
            Anonymous

            so this is the power of IQfy...

          • 6 days ago
            Anonymous

            ?

          • 6 days ago
            Anonymous

            Case in point.

            >For example, f(n) = 1 - 1/n is O(1) but is not constant.

            Anyone with a tiny bit of CS experience can show you that it is constant, by virtue of n being literally irrelevant. Since n is an integer, even for n = 1 the 1/n part becomes irrelevant, never mind when n approaches infinity (i.e. a number that is big enough), which will be the majority of cases.

          • 6 days ago
            Anonymous

            Pic related function is O(1).

          • 6 days ago
            Anonymous

            >troonymos
            YWNBAW

          • 6 days ago
            Anonymous

            I already AM a woman but ok.

          • 6 days ago
            Anonymous

            what's the chud approved web based grapher? i'm always looking for new sites to try.

          • 6 days ago
            Anonymous

            By reducing it to (x^2 sin(x^2))/x^2 (which you can do since we do not care about constants; we are evaluating the function for a really large x) you can see that it essentially simplifies to sin(x^2), which is always contained in the [0,1] interval (I'm deliberately cutting out the negatives here because they can't exist), so f(n) = O(1).

          • 6 days ago
            Anonymous

            and that function runs in constant time. What's the problem here?

          • 6 days ago
            Anonymous

            >constant time, which means it doesn't take longer as the input size increases

          • 6 days ago
            Anonymous

            in the limit

          • 6 days ago
            Anonymous

            OK, that's what I thought. You're nitpicking. But yes, you're right.

          • 6 days ago
            Anonymous

            Nobody said it was constant.
            Constant time complexity class has never implied a constant function.
            It's still called constant time though in case you didn't know.

          • 6 days ago
            Anonymous

            >Nobody said it was constant.
            Redditor in the OP did.
            > Constant time complexity class has never implied a constant function.
            Redditor explained that for him constant time means constant function.

          • 6 days ago
            Anonymous

            >Redditor in the OP did.
            Where?
            >Redditor explained that for him constant time means constant function.
            Where?

          • 6 days ago
            Anonymous

            >Where?
            >which means it doesn't take longer as the input size increases
            Read homie

          • 6 days ago
            Anonymous

            Clearly it is you who doesn't get big O notation. Big O describes the asymptotic growth of a function. Now remind me, what is the limit of 1/n as n goes to infinity again?

          • 6 days ago
            Anonymous

            The limit of 1/n as n goes to infinity is 0. What's your point?

          • 6 days ago
            Anonymous

            Okay, I'll lead you on a bit more.
            Now we know the limit. What function bounds the growth of lim 1 + 1/n?

          • 6 days ago
            Anonymous

            Not 1

          • 6 days ago
            Anonymous

            The growth of 1? 1 doesn't grow, it's a constant. Any function f(x) which satisfies f(x)>=1 bounds 1 from above.

          • 6 days ago
            Anonymous

            >Any function f(x) which satisfies f(x)>=1 bounds 1 from above.
            No, it has to be strictly greater than

          • 6 days ago
            Anonymous

            >No, it has to be strictly greater than
            A bound does not have to be strictly greater. Take the bound for [0,1]: it is clearly 1, since any value in [0,1] is either smaller than 1 or equal to 1.

            >That's wrong though, 1 doesn't bound the function
            What was clearly meant here is that 1 bounds lim n -> infty f(n) = 1.

            >Just what do you think O(1) means?
            It means that f(n) is bounded by a function in the set O(1) (a constant function) as n goes to infinity, i.e. lim n -> infty |f(n)| <= g(n) (let's say g(n) = 1).

          • 6 days ago
            Anonymous

            >A bound does not have to be strictly greater
            It doesn't, but the function never reaches 1, it's always greater than it. 1 bounds the limit but not the function. 1.000001 bounds the function after a certain point though.
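
            That "certain point" can be computed (a minimal Python sketch; the bound 1.000001 is just the example above): 1 + 1/n <= 1 + eps exactly when n >= 1/eps.

            eps = 1e-6     # 1.000001 = 1 + eps
            n0 = 10**6     # 1 + 1/n <= 1 + eps exactly when n >= 1/eps
            for n in [n0 - 1, n0, 10 * n0]:
                print(n, 1 + 1 / n <= 1 + eps)   # False, True, True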

          • 6 days ago
            Anonymous

            >It means that f(n) is bounded by a function in the set O(1) (a constant function) as n goes to infinity, i.e. lim n -> infty |f(n)| <= g(n) (let's say g(n) = 1).
            This is wrong. Plenty of O(1) functions do not have limits as n -> infinity.

          • 6 days ago
            Anonymous

            >This is wrong. Plenty of O(1) functions do not have limits as n -> infinity.
            Fair enough, I left that out, but indeed for this reason big O is defined as being bounded by g(x) only after some x_0 (see the post with the definition below; it isn't particularly long).

            >It doesn't, but the function never reaches 1, it's always greater than it. 1 bounds the limit but not the function. 1.000001 bounds the function after a certain point though.
            We were discussing the limit of the example f(x), which is 1 and therefore bounded by 1.

            >Also your definition is circular.
            It is not circular, as 1 is by definition in O(1). After this we can construct the whole set from this one element.

            >You are also illiterate. Good job.
            Please elaborate.

          • 6 days ago
            Anonymous

            >which is 1 and therefore bounded by 1.
            The limit of a function being 1 doesn't imply that the function itself is bounded by 1.

          • 6 days ago
            Anonymous

            it does in big-O notation, except it can be any fixed constant, it doesn't have to be 1

          • 6 days ago
            Anonymous

            >It is not circular, as 1 is by definition in O(1). After this we can construct the whole set from this one element.
            It looks like the constant function f(_) -> 2 is never in your circular definition of O(1).

          • 6 days ago
            Anonymous

            Also your definition is circular.
            >O(1) means that f(n) is bounded by a function in the set of O(1).
            You are also illiterate. Good job.

          • 6 days ago
            Anonymous

            They use Theta for tight bounds.

          • 6 days ago
            Anonymous

            You're getting it!
            Indeed every function >= 1 bounds it! It is common practice to say that the asymptotic growth of 1 + 1/n is O(1), but it is in fact also bounded by the set of functions in O(n), etc.

          • 6 days ago
            Anonymous

            Just what do you think O(1) means?

            >in the limit
            What does that mean?

          • 6 days ago
            Anonymous

            in the limit, the computation does not take any longer

          • 6 days ago
            Anonymous

            I asked you what you mean by "in the limit", and you just repeated in the limit. What does that mean?

          • 6 days ago
            Anonymous

            ah, sorry.
            as the input n increases, the time complexity ceases to grow

          • 6 days ago
            Anonymous

            But it doesn't for 1 - 1/n. What's an n after which you think it ceases to grow?

          • 6 days ago
            Anonymous

            8 on its side

          • 6 days ago
            Anonymous

            Not a member of the domain of the function.
            Does 2 + sin(n) cease to grow as n increases?

          • 6 days ago
            Anonymous

            asymptotically yes

          • 6 days ago
            Anonymous

            Wrong. Try to prove it (you won't be able to)

          • 6 days ago
            Anonymous

            I indeed am incapable of formally proving that max(sin(x)) = 1, but I don't really care about that. It's quite obvious.
            If I'd hazard a geometric guess, it would be something about the two sides of the triangle coalescing into 1, hence the ratio becoming 1.
            anyway, how does formally proving something that is obvious matter to whatever you're on about?

          • 6 days ago
            Anonymous

            I didn't ask you to formally prove max(sin(x)) which is already obvious for real x. I asked you to prove that sin(x) ceases to grow asymptotically as x increases, a demonstrably false claim which you made.

          • 6 days ago
            Anonymous

            >I asked you to prove that sin(x) ceases to grow asymptotically as x increases
            so you concede that max(sin(x)) = 1? there's your answer then

          • 6 days ago
            Anonymous

            To add to that: are you perhaps confusing convergence and asymptotic growth?

          • 6 days ago
            Anonymous

            >Does 2 + sin(n) cease to grow as n increases?
            That's clearly bounded above by 3 you dumbass.
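
            Both halves of this exchange check out numerically (a small Python sketch; the sample points are arbitrary): 2 + sin(n) never converges, i.e. it never stops changing, yet it stays inside [1, 3], so it is O(1).

            import math

            ys = [2 + math.sin(n) for n in range(1, 10**5)]
            print(min(ys), max(ys))   # everything lies within [1, 3]
            print(ys[:5])             # yet the values keep oscillating forever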

          • 6 days ago
            Anonymous

            > Indeed every function >= 1 bounds it!
            That's wrong though, 1 doesn't bound the function

          • 6 days ago
            Anonymous

            >O(1) is when functions don't have variables

          • 6 days ago
            Anonymous

            Who are you quoting?

            >so you concede that max(sin(x)) = 1? there's your answer then
            How does max(sin(x)) = 1 imply that sin(x) ceases to grow asymptotically? You just restated your claim, which I denied. Learn to read.

            >are you perhaps confusing convergence and asymptotic growth?
            No.

          • 6 days ago
            Anonymous

            >How does max(sin(x)) = 1 imply that sin(x) ceases to grow asymptotically?
            ??? because |sin(x)| is thus obviously bounded by 1. this is so obvious that I have to assume you're either a troll or a bot, so I'll stop conversing with you now. good night:)

          • 6 days ago
            Anonymous

            How is it not constant?

            Can someone explain to a peabrain? The computer will load the same amount of data and perform the same instructions each time regardless of how big N is. Or are we assuming N can have more than 1 item?

          • 6 days ago
            Anonymous

            >Can someone explain to a peabrain?
            Change f(n) a bit to see it.
            For example: f(n) = 10000 - 10000 / n

            function what(arr) {
                for (var i = 10000; i > (10000 / arr.length); i--) {
                    console.log(i);
                }
            }

            // arr of size 1, 0 ops
            // arr of size 2, 5000 ops
            // arr of size 10, 9000 ops
            // arr of size 10000, 9999 ops

            // arr of size 20000, 10000 ops
            // arr of size 30000, 10000 ops
            // arr of size 40000, 10000 ops

            The op count is not constant; it depends on the size of arr.
            But you can write f(n) <= k * g(n) for all n,
            with k = 10000 and g(n) = 1.
            Thus, by definition, O(1).

      • 6 days ago
        Anonymous

        elaborate

        • 6 days ago
          Anonymous

          He's wrong, it is true. O(1) means that there exists a constant such that, for any n greater than a certain point, the function is at most that constant.

      • 6 days ago
        Anonymous

        That's exactly what it means. Explain to me how OOP is wrong without just saying it is.

  4. 6 days ago
    Anonymous

    >tfw incrementing an integer in binary is O(n!)
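
    The joke lands because big O only gives an upper bound: incrementing an n-bit binary counter is worst-case O(n) from carry propagation, which makes it O(n!), O(n^n), and anything bigger too. A minimal Python sketch of the worst and best cases (the bit-list representation is illustrative):

    def increment(bits):          # bits[0] is the least significant bit
        ops = 0
        i = 0
        while i < len(bits) and bits[i] == 1:  # ripple through trailing 1s
            bits[i] = 0
            ops += 1
            i += 1
        if i < len(bits):
            bits[i] = 1
            ops += 1
        return ops

    print(increment([1, 1, 1, 0]))  # 0111 -> 1000: 4 ops, the O(n) worst case
    print(increment([0, 0, 0, 1]))  # 1000 -> 1001: 1 op, the O(1) best case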

    • 6 days ago
      Anonymous

      It's also O(n^n)

      • 6 days ago
        Anonymous

        it's truly amazing that computers can even do math at all with that kind of bound

    • 6 days ago
      Anonymous

      >It's also O(n^n)
      You lost me there.

      • 6 days ago
        Anonymous

        Refer to the post with the definition below.

        >You're right. I meant O(1!). Fixed.
        Still wrong, same example applies.

        • 6 days ago
          Anonymous

          No, constant factorial time is right. It uses 0 space.

          • 6 days ago
            Anonymous

            1! = 1

          • 6 days ago
            Anonymous

            The science has been settled already.

          • 6 days ago
            Anonymous

            >It uses 0 space.
            wrong, you have to load the code into memory

          • 6 days ago
            Anonymous

            There is no code. There is no memory. It’s negative space. Black hole(1)

          • 6 days ago
            Anonymous

            blackholes aren't real

          • 6 days ago
            Anonymous

            its called big O because thats what she gets when I'm done with her black hole

      • 6 days ago
        Anonymous

        big O is simply an upper bound, so almost any operation is O(n!n^n^n) or whatever

    • 6 days ago
      Anonymous

      it's not though, because processors have instructions like AND, OR, XOR, and they also have ICs specifically for math.
      https://www.baeldung.com/cs/arithmetic-logic-unit

      • 6 days ago
        Anonymous

        >he doesn't know

  5. 6 days ago
    Anonymous

    Why are IQfyners incapable of coming up with a topic of discussion that doesn't involve reposting from social media?

  6. 6 days ago
    Anonymous

    In my opinion as some random CS student, that sounds about right.
    Maybe there's an extremely autistic way to say that he's wrong, but for applied math (CS), that's kind of what it comes down to.

    • 6 days ago
      Anonymous

      Clearly you have no idea what big O notation denotes.

  7. 6 days ago
    Anonymous

    the definition isn't particularly long.
    although, as usual, mathematical notation makes it somewhat cryptic
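
    Spelled out in the thread's own plain-text style, the standard textbook definition: f(n) is O(g(n)) iff there exist constants C > 0 and n_0 such that |f(n)| <= C * |g(n)| for all n >= n_0. A minimal Python sketch that spot-checks a candidate pair (C, n_0) numerically (not a proof; the helper name is made up):

    def looks_big_O(f, g, C, n0, samples):
        # numeric spot-check of |f(n)| <= C * |g(n)| over sampled n >= n0
        return all(abs(f(n)) <= C * abs(g(n)) for n in samples if n >= n0)

    # 1 - 1/n is O(1): witness C = 1, n0 = 1, g(n) = 1
    print(looks_big_O(lambda n: 1 - 1 / n, lambda n: 1, 1, 1, range(1, 10**4)))  # True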

    • 6 days ago
      Anonymous

      Yet somehow nobody except me on IQfy (and possibly you) understands the definition. Just read this thread. People are completely baffled as to what big O means.

      • 6 days ago
        Anonymous

        what part do you think people don't understand?

    • 6 days ago
      Anonymous

      I prefer the definition
      f in O(g) <=> limsup_{n->infty} f(n)/g(n) < infty

  8. 6 days ago
    Anonymous

    O(n^2) is equal to O(1). Why is this so hard for people to understand?

    • 6 days ago
      Anonymous

      It's not. n^2 is in O(n^2) but not in O(1): if n^2 <= C held for all n beyond some n_0, any n larger than both n_0 and sqrt(C) would violate the inequality, so no constant C works.

      • 6 days ago
        Anonymous

        You’re right. I meant O(1!). Fixed.

  9. 6 days ago
    Anonymous

    so the whole point of the thread is that the reddit guy didn't say "the worst case is bounded by a constant independently of the input" and instead said "the function is constant time"

  10. 6 days ago
    Anonymous
  11. 6 days ago
    Anonymous

    I'm not gonna fall for this bait again.
    It's good bait though.

  12. 6 days ago
    Anonymous

    quantum computers will achieve true constant time

    • 6 days ago
      Anonymous

      What a ridiculous thing to say

      • 6 days ago
        Anonymous

        he means that time will constantly be wasted on getting quantum computers to do anything useful, and AGI won't even bother because it can more easily manipulate human brains to function as "cloud quantum computers"

  13. 6 days ago
    Anonymous

    A computer science professor and transsexual was teaching a class on the Lambda Calculus.

    "Before the class begins, you must get on your knees and worship Alan Turing and accept that he was the most highly-rigorous Computer Scientist the world has ever known!"

    At this moment, a brave, intuitionist, wildbergian computer scientist who had produced 1500 actual programs and understood the necessity of algorithmic thinking stood up and held up a reel of tape memory.

    "How much memory does this Turing Machine have?"

    The arrogant professor smirked quite Infinitistly and smugly replied "a Turing Machine requires infinite memory!"

    "Wrong. There is only a finite amount of memory in this reel of tape."

    The professor was visibly shaken, and dropped his japanese chalk and copy of SICP. He stormed out of the room crying those infinist crocodile tears. There is no doubt that at this point our professor, Cardinal troonystein, wished he had pulled himself up by his bootstraps and actually programmed instead of being a sophist infinity schizo. He wished so much that he had a gun to shoot himself from embarrassment, but the bullet would take infinite steps to reach his head!

    The students applauded and all studied John Backus that day and accepted Dennis Ritchie as their lord and savior. An eagle named “Imperative Programming” flew into the room and perched atop the SICP book and shed a tear on the chalk. Wildberger's videos were watched several times, and God himself showed up after descending a finite amount from heaven.

    The professor lost his tenure and was fired the next day. He died of gay plague AIDS and was expelled from the paradise Wildberger had created for all eternity.

  14. 6 days ago
    Anonymous

    I don't care. All I know is O(1) good, O(n) bad.

    • 6 days ago
      Anonymous

      not always. if you know for sure that you won't have a big input, O(n) might be faster/better than O(1). you always have to test before making a judgment, which in turn makes big O notation somewhat useless
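
      The "test before judging" point can be made concrete (a hedged micro-benchmark sketch; the sizes and numbers are arbitrary, and the winner depends entirely on hardware and interpreter): an O(n) list scan versus an O(1)-average set lookup on a tiny collection.

      import timeit

      small = list(range(8))
      small_set = set(small)

      t_list = timeit.timeit(lambda: 7 in small, number=10**6)     # O(n) scan
      t_set = timeit.timeit(lambda: 7 in small_set, number=10**6)  # O(1) average
      print(t_list, t_set)  # at n = 8 the constant factors decide, not the big O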

      • 6 days ago
        Anonymous

        Yeah but if the difference between the two is not big enough I also don't care.

    • 6 days ago
      Anonymous

      >All I know is O(1) good, O(n) bad.
      Depends on what you're doing. For searching, O(n) isn't good. For sorting, it's superb.

  15. 6 days ago
    Anonymous

    the thing most people get wrong is O(1) vs O(log n)
    if your "direct" equation has e.g. something to the power of n, you use O(log n) multiplications
    https://cp-algorithms.com/algebra/binary-exp.html
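
    For reference, the linked technique in a minimal Python sketch (the standard square-and-multiply loop): it computes a^n with O(log n) multiplications instead of the naive O(n).

    def binpow(a, n):
        # square-and-multiply: O(log n) multiplications total
        result = 1
        while n > 0:
            if n & 1:        # low bit of exponent set: multiply in current power
                result *= a
            a *= a           # square: a, a^2, a^4, a^8, ...
            n >>= 1          # move to the next bit of the exponent
        return result

    print(binpow(3, 13))  # 1594323 == 3**13, with ~log2(13) squarings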

    • 6 days ago
      Anonymous

      print(pow(a, n))  # O(log n)
