neuralink

you should be able to solve this

  1. 3 weeks ago
    Anonymous

    Literally what reason does anyone have to publish the source code under a FOSS compatible license even if they did figure it out

    • 3 weeks ago
      Anonymous

      >Literally what reason does anyone have to publish the cure for cancer for free even if they did figure it out

      • 3 weeks ago
        Anonymous

        >tricks you into making neuralink's business possible, for free
        how does IQfy worship this israelite
        >thanks for the billions, homosexual!

        • 3 weeks ago
          Anonymous

          Surely some revolutionary compression algorithm for sending brain signals can only mean good in the long run. What about competitors to Neuralink in 20, 30 years? Or maybe some other usage we're not even aware of yet.

          • 3 weeks ago
            Anonymous

            >compressing a gorillion unblockable ads that are projected straight into your brain because you live in a society and you are a good goy golem

          • 3 weeks ago
            Anonymous

            >What about competitors to Neuralink in 20, 30 years?
            The competency crisis will have regressed us back to 1950s tech by then.

          • 3 weeks ago
            Anonymous

            >1950s tech
            That would be an improvement. Unfortunately all we're going to get is blander and worse versions of what we already have, which is already how things are starting to be.

          • 3 weeks ago
            Anonymous

            HAHAHAHAHAHHAAHAHAHHAHAHAHA

        • 3 weeks ago
          Anonymous

          contribooot to science and yooomanity, goy! prove that you are the smartest goodest boy! do it for free!

        • 3 weeks ago
          Anonymous

          Because Stallman is a based autist and cares about freedoms

        • 3 weeks ago
          Anonymous

          > plz work for free becoz nobody at elon's collection of garbage corporations understands compression algorithms
          > i cannot understand how to compress a 20khz signal
          > 20khz
          lmfao. this is not only pathetic but incredibly embarrassing and fricking hilarious.

          there are people in this world that have no morals, backbone or any intelligence. they will simply give everything away for free to billionaires in the hope that said billionaire gives them some acknowledgement of their existence (that never happens).

      • 3 weeks ago
        Anonymous

        >bro just work for free for billionaires and make them even richer
        >it's for freedom as in free ass fricking by everyone while you live in poverty

        • 3 weeks ago
          Anonymous

          >freedom to sell your software
          not possible with proprietary shit

      • 3 weeks ago
        Anonymous

        >tricks you into making neuralink's business possible, for free
        how does IQfy worship this israelite
        >thanks for the billions, homosexual!

        because anyone else can use it free of charge, including competition
        What do you think open sourcing a cancer cure will do to the price of cancer treatments?

        • 2 weeks ago
          Anonymous

          >open source cure for cancer
          Lmao if someone made one the media would just shout it down as conspiracy theory quackery and tell people to listen to their doctor for medical advice, just like the whole "essential oils" shit.

  2. 3 weeks ago
    Anonymous

    i would run zstandard on it
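
    something like this, assuming the dataset really is just a zip of .wav files like it sounds (the data/ path is a guess at where you extracted them, and you need pip install zstandard):

    # Compare zlib (what "zip" uses) against zstandard on the raw recordings.
    import glob
    import zlib
    import zstandard as zstd

    raw = b"".join(open(p, "rb").read() for p in glob.glob("data/*.wav"))

    zlib_size = len(zlib.compress(raw, 9))
    zstd_size = len(zstd.ZstdCompressor(level=19).compress(raw))

    print(f"raw:  {len(raw)} bytes")
    print(f"zlib: {zlib_size} bytes ({len(raw) / zlib_size:.2f}x)")
    print(f"zstd: {zstd_size} bytes ({len(raw) / zstd_size:.2f}x)")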

    • 3 weeks ago
      Anonymous

      You don't think they tried that already, saar?

      • 3 weeks ago
        Anonymous

        they clearly show they only used zip on it thoughever

        • 3 weeks ago
          Anonymous

          the leaderboard has zip but not zstandard

          Yes, the almighty leaderboard shows everything they've ever tried. The neumorons over at neuralink have only heard of zip files before.

      • 3 weeks ago
        Anonymous

        the leaderboard has zip but not zstandard

  3. 3 weeks ago
    Anonymous

    >working for ~~*elon*~~ for free
    No thanks

    • 3 weeks ago
      Anonymous

      >200x lossless compression at <1ms and <10mW
      >oh btw you get nothing out of this

      this
      not doing your homework FOR FREE

  4. 3 weeks ago
    Anonymous

    >Best submission will win a 12-month subscription to twitter blue

  5. 3 weeks ago
    Anonymous

    why didn't sar Zhang et al. try anything else aside from zip?

    • 3 weeks ago
      Anonymous

      And why didn't they record time and power usage?

  6. 3 weeks ago
    Anonymous

    I don't do free labor.

    • 3 weeks ago
      Anonymous

      contribute to open sores saar, do it for humanity (Shlomo, >10 bodycount prostitutes, shitskins, and 8 billion mouth breathing Black personcattle)

  7. 3 weeks ago
    Anonymous

    >200x compression in <1ms using 10mW
    Fricking have a nice day you absolute moron.

    • 3 weeks ago
      Anonymous

      Sar don't be racist.

  8. 3 weeks ago
    Anonymous

    200x compression, in real time, while only using 10mW of power... These guys are nuts. It's not like people aren't trying to get better compression algorithms; in fact I'd say it's almost impossible to get much better than what we currently have unless some German introvert guy revolutionizes our very understanding of reality

    • 3 weeks ago
      Anonymous

      I could think of much better uses for this german than compression algorithms.

    • 2 weeks ago
      Anonymous

      domain-specific compression is often able to achieve insane performance compared to general compression techniques
      for instance, a little knowledge about multibeam sonar sensitivity lets you turn 1024 (uint64, double) pairs into 1 double, 1 uint16, and 1024 float16, a total savings from 16384 bytes to 2058 bytes, or 87% compression.
      and while technically lossy, there's no actual data lost in the compression because the physical device isn't more sensitive than the compression loss
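
      a rough sketch of that kind of packing (the field layout and delta scheme here are made up for illustration, not the actual sonar format):

      # The post's 1024 (uint64, float64) pairs -> 1 float64, 1 uint16, 1024 float16;
      # this sketch only covers the float64 side of that (8 + 2 + 2048 = 2058 bytes).
      import numpy as np

      def pack(ranges: np.ndarray) -> bytes:
          base = np.float64(ranges.mean())                 # 1 float64
          deltas = (ranges - base).astype(np.float16)      # quantisation below sensor noise
          return base.tobytes() + np.uint16(len(ranges)).tobytes() + deltas.tobytes()

      def unpack(blob: bytes) -> np.ndarray:
          base = np.frombuffer(blob[:8], dtype=np.float64)[0]
          n = int(np.frombuffer(blob[8:10], dtype=np.uint16)[0])
          deltas = np.frombuffer(blob[10:10 + 2 * n], dtype=np.float16)
          return base + deltas.astype(np.float64)

      ranges = np.random.uniform(40.0, 60.0, 1024)
      blob = pack(ranges)
      print(len(blob), np.max(np.abs(unpack(blob) - ranges)))   # 2058 bytes, tiny error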

  9. 3 weeks ago
    Anonymous

    such a solution would be worth too much to give to elon for free

  10. 3 weeks ago
    Anonymous

    i don't get the lossless part. electrode data probably has clear patterns but there's just no way you can get much compression when it needs to be lossless

    • 3 weeks ago
      Anonymous

      This is baffling. They're going to have to compromise on one of their stipulations, demanding lossless compression is a weird line to draw on what I assume is already a nearly impossible set of requirements

      • 3 weeks ago
        Anonymous

        if you submitted an incredibly efficient lossy algorithm and demonstrated why those tradeoffs were worth it im sure you would get some recognition

        • 3 weeks ago
          Anonymous

          This is baffling. They're going to have to compromise on one of their stipulations, demanding lossless compression is a weird line to draw on what I assume is already a nearly impossible set of requirements

          any measurement data is going to be noisy, which makes good lossless compression impossible. but at the same time you can probably compress this kind of data 1000-fold and still use it for every practical application. synapses are slow as frick

          • 2 weeks ago
            Anonymous

            Yeah, 10-bit sampling at 20kHz seems kinda overkill to start with. It'd be good to know more about the signal characteristics and why lossy compression isn't acceptable, but I am not a neuroscientist so what would I know?

        • 3 weeks ago
          Anonymous

          and I think this is the real plan
          getting nerds to produce really fricking good lossy algorithms in hopes of winning the 700k or a job with daddy elon then getting "lol actually not lossless but thanks for the code loser" back

  11. 3 weeks ago
    Anonymous

    this problem is not meant to be solved. the hardware team fricked up and expected the software team to fix it like they usually do. but the software team is like uh it's not doable. hardware team calls software team incompetent. software team posts problem on the internet expecting nobody to solve it.

  12. 3 weeks ago
    Anonymous

    >elon literally trying to be gavin belson irl
    kek

  13. 3 weeks ago
    Anonymous

    >do work for free
    lmao

    • 3 weeks ago
      Anonymous

      crazy ppl be crazy

  14. 3 weeks ago
    Anonymous

    >200x lossless compression at <1ms and <10mW
    >oh btw you get nothing out of this

  15. 3 weeks ago
    Anonymous

    Are their engineers moronic?
    You can “infinitely” compress anything if you make a large enough pattern dictionary to hold in a hashmap.

    • 3 weeks ago
      Anonymous

      I'm guessing the only 'tard here is you

      • 3 weeks ago
        Anonymous

        It’s true though. He didn’t say anything about the size of the program.

        • 3 weeks ago
          Anonymous

          Except it does, look at the page, tard. The compressed size includes the size of the encode and decode binaries

    • 3 weeks ago
      Anonymous

      O(1) lookups can take longer than 1ms on a sufficiently large set, moron

  16. 3 weeks ago
    Anonymous

    Just zip it 91x times

    • 3 weeks ago
      Anonymous

      holy frick it worked, thanks

    • 3 weeks ago
      Anonymous

      delet this, I want to get my reward of nothing ahead of you.

    • 3 weeks ago
      Anonymous

      2.2^91=200
      Great, it was worth getting up today too

  17. 3 weeks ago
    Anonymous

    none of the links in that pic are clickable.

  18. 3 weeks ago
    Anonymous

    Opus, 128 Kbps VBR

    • 3 weeks ago
      Anonymous

      Has to be lossless

  19. 3 weeks ago
    Anonymous

    Guys, I solved the compression problem. I just delete the data they give me and randomly generate data on demand as they ask for it back. There's a non-0 probability that the randomly generated data exactly matches the original.

    • 3 weeks ago
      Anonymous

      wtf surely that violates conservation of information laws

  20. 3 weeks ago
    Anonymous

    FLAC

    • 3 weeks ago
      Anonymous

      >FLAC
      I don't expect it to meet the requirements, but I came here curious to see if anyone has tried it.

  21. 3 weeks ago
    Anonymous

    Just train an LLM and use that to compress it. Ez.

  22. 3 weeks ago
    Anonymous

    I've worked with electrophysiological waveform data in the brain before, and one thing to know about it is that it's a method with a low signal-to-noise ratio. i really doubt there's much value in keeping the raw waveform data lossless, because slight deviations don't matter. when identifying neurons you're looking at a cluster of waveforms that exhibit similar characteristics in local areas. i doubt a lossy compression scheme would impact the final results in any way that matters.

    • 3 weeks ago
      Anonymous

      Yes, filter the data and turn it into a new type of language. You can always add new words later when main stream usage begins.

    • 3 weeks ago
      Anonymous

      >doubt there's much value in having lossless data
      That was my first thought too.

  23. 3 weeks ago
    Anonymous

    only genuine IQs >145 are tackling problems like these and IQfy is full of coping midwits

    • 3 weeks ago
      Anonymous

      >make an algorithm that violates data entropy
      >"only smart people would dare work on this"
      I think it takes a special kind of stupid to go buy headlamp fluid, and this is the same idea.

    • 3 weeks ago
      Anonymous

      >midwits
      You overestimate the average intelligence of IQfy.

    • 3 weeks ago
      Anonymous

      Even above-baseline anons know about that thing called entropy, and why such strict requirements are absolutely batshit insane.
      Either you drop the lossless requirement, which can still be ok-ish, or you can only converge toward a theoretical limit, which usually gets computationally expensive and therefore infeasible under such criteria. (200x my ass lmao, unless somebody cracks the code of the human mind ofc.)

  24. 3 weeks ago
    Anonymous

    So did all these posts help get your homework done op?

  25. 3 weeks ago
    Anonymous

    Have they tried the middle-out jelcompression method?

  26. 3 weeks ago
    Anonymous

    >all these morons thinking you'd be giving it away for free

    • 3 weeks ago
      Anonymous

      > upload source code
      yes you are

      • 3 weeks ago
        Anonymous

        If you think you're not getting a high 6 figure job for solving the problem then you've got another think coming.

        • 3 weeks ago
          Anonymous

          >6 figure job for an algorithm that will make them potentially millions
          You got scammed by corporations into working for free via their "show us your open source contribution" scam.

  27. 3 weeks ago
    Anonymous

    I am fairly confident I will have a fully working solution very soon. Three months probably. Six months definitely.

  28. 3 weeks ago
    Anonymous

    how is power consumption measured? I guess this runs on an FPGA/MCU? Interesting challenge but 200x compression seems very hard to achieve.

    • 3 weeks ago
      Anonymous

      The format is amenable to PAQ on an ASIC. Let's start working on the problem. IQfy should solve it. Musk is a great man. He's building the future.

  29. 3 weeks ago
    Anonymous

    P'takh!

  30. 3 weeks ago
    Anonymous

    >just losslessly compress random data x200
    lol

    • 3 weeks ago
      Anonymous

      Electrode readings have high temporal resolution and exhibit patterns or correlations that can be exploited for compression.
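
      for what it's worth, a minimal sketch of that idea is delta-encoding each channel before a generic coder; how much it actually buys you depends entirely on how correlated the real recordings are:

      # Delta-encode one channel, then hand it to zlib. The toy signal below is
      # a random walk; real electrode data may behave very differently.
      import zlib
      import numpy as np

      def compress_channel(samples: np.ndarray) -> bytes:
          deltas = np.diff(samples, prepend=samples[:1]).astype(np.int16)
          return zlib.compress(deltas.tobytes(), 9)

      samples = (np.cumsum(np.random.randint(-3, 4, 20_000)) + 512).astype(np.int16)
      print(len(samples.tobytes()), "->", len(compress_channel(samples)))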

      • 3 weeks ago
        Anonymous

        So does music, and FLAC or ALAC only get down to about 60% of WAV size on average. Please give a better reason why you think there would be 200x worth of savings here for comparatively little computation. With zip they got 2x. Again, 1024 electrodes at 20kHz.

        • 3 weeks ago
          Anonymous

          >Genetics compression algorithms (not to be confused with genetic algorithms) are the latest generation of lossless algorithms that compress data (typically sequences of nucleotides) using both conventional compression algorithms and specific algorithms adapted to genetic data. In 2012, a team of scientists from Johns Hopkins University published the first genetic compression algorithm that does not rely on external genetic databases for compression. HAPZIPPER was tailored for HapMap data and achieves over 20-fold compression (95% reduction in file size), providing 2- to 4-fold better compression much faster than leading general-purpose compression utilities.
          It's a biological system with constantly repeating signals. You can go very far with a simple grammar-based code. It's not worth doing for free, but it can be done.
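
          a toy sketch of what a grammar-based code looks like (byte-pair / Re-Pair style digram substitution); nowhere near 200x, just to show the shape of it:

          # Repeatedly replace the most frequent adjacent symbol pair with a new
          # symbol and record the rule; decompression replays the rules backwards.
          from collections import Counter

          def bpe_compress(seq, rounds=100):
              rules = {}
              next_sym = max(seq) + 1
              for _ in range(rounds):
                  pairs = Counter(zip(seq, seq[1:]))
                  if not pairs:
                      break
                  (a, b), count = pairs.most_common(1)[0]
                  if count < 2:
                      break
                  rules[next_sym] = (a, b)
                  out, i = [], 0
                  while i < len(seq):
                      if i + 1 < len(seq) and seq[i] == a and seq[i + 1] == b:
                          out.append(next_sym)
                          i += 2
                      else:
                          out.append(seq[i])
                          i += 1
                  seq = out
                  next_sym += 1
              return seq, rules

          data = [1, 2, 3, 4] * 500      # highly repetitive toy "signal"
          compressed, rules = bpe_compress(data)
          print(len(data), "->", len(compressed), "symbols +", len(rules), "rules")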

          • 3 weeks ago
            Anonymous

            Not sure if the same thing would work here; genetic data is quantised and broken into discrete units, while the data here is continuous waveforms, so unless the electrode data has obvious, reliable and universal units it's not that amenable to a tailored grammar system. Audio compression algorithms would probably be a more promising avenue to pursue.

          • 3 weeks ago
            Anonymous

            Then obvious, reliable, and universal units are exactly what we need to get things rolling.

          • 3 weeks ago
            Anonymous

            Most of the signals are probably irrelevant. How does spectral decomposition work on neuronal signals?

          • 3 weeks ago
            Anonymous

            irrelevant if the morons are demanding lossless compression

  31. 3 weeks ago
    Anonymous

    Does it have to be lossless?

  32. 3 weeks ago
    Anonymous

    I got this.
    Now where's my cracked version of WinRAR?

  33. 3 weeks ago
    Anonymous

    just use 64kbps mp3. the human brain can't tell the difference.

    • 3 weeks ago
      Anonymous

      you're a special kind of fricking moronic. it's admirable.

      Does it have to be lossless?

      yes. though i imagine there's already a lot of loss if signals are capped at 20khz. lossy compression would not be ideal for this because you'd need a more robust and complex error correction scheme, which would probably eat away at any gains you made compressing the signal. stream it raw or use some commonly available (and lossless) compression algorithm. not many choices.

  34. 3 weeks ago
    Anonymous

    the only paper I could find is this https://digital.csic.es/bitstream/10261/286262/1/lowelectro.pdf
    it's really ridiculous what they are demanding; 200x @ <10mW in <1ms is impossible.

  35. 3 weeks ago
    Anonymous

    >thinking the solution is some absurd compression at 200x and not developing some kind of biological solution to mutate the primate's brain that can allow for more expressive brainwaves
    this is a posting for umbrella corp not elon's company

  36. 3 weeks ago
    Anonymous

    I define the bit 1 to be the compressed file. for all other formats the byte stream starts with 0 and has the normal data

  37. 3 weeks ago
    Anonymous

    They expect a solution valued at xx trillion dollars to be handed over on a silver platter for $0?

    • 3 weeks ago
      Anonymous

      it will look good on your cv

      • 3 weeks ago
        Anonymous

        It's not just that it looks good on your CV, IT IS YOUR CV for the job.

    • 3 weeks ago
      Anonymous

      morons like you dont understand the implication

      • 3 weeks ago
        Anonymous

        Pray tell. I'd just patent my new compression algorithm that can achieve a 99.5% compression rate with sub-1ms delay on ESP32-tier hardware, and forever terrorize them in court if they ever use it without giving me at least an 8% stake in the company.

    • 3 weeks ago
      Anonymous

      yes. they are that incompetent.

      it will look good on your cv

      > working for free for a billion dollar corporation full of talentless fricking Black folk looks good on a cv
      never did and never will.

      >leaderboard
      poor fricker got scammed out of potentially a billion dollar idea he can expand on

      that's the plan, anon. sadly this world is full of suckers and morons.

  38. 3 weeks ago
    Anonymous

    <the only usecase for neuralink is implanting it in your cat so it can find its way home

  39. 3 weeks ago
    Anonymous

    Using a wireless signal from your brain implant is the stupidest idea ever.

  40. 3 weeks ago
    Anonymous

    >200x compression for 20KHz signals
    LMAOOOO
    You can't even get anywhere near that with a NN.

  41. 3 weeks ago
    Anonymous

    here's a spectrogram of the data from 1 electrode
    they're asking for 200mbps of fricking NOISE to be losslessly compressed to 1mbps

    • 3 weeks ago
      Anonymous

      whats it look like in log tho

    • 3 weeks ago
      Anonymous

      the moronic thing is the whole point of neuralink is to overcome the superposition/source isolation problem by just sampling at ridiculous spatial resolutions (ie a shit ton of densely packed electrodes)

      so who the frick is asking for lossless compression here? likely another team from the ones actually designing the hardware
      with the density they've achieved they probably only need 4 bits. Early on they even had some PhDs talking about just streaming out binary spike trains. What happened to that?

  42. 3 weeks ago
    Anonymous

    Project the bytes onto the metal using ancestral african healing techniques

  43. 3 weeks ago
    Anonymous

    >We need to get our compression up to 200x, can you help?
    >200x seems like kind of a lot. What's your current compression ratio using best known available methods?
    >2.2x

    • 3 weeks ago
      Anonymous

      >compression ratio using best known available methods
      >zip
      Is this bait?

  44. 3 weeks ago
    Anonymous

    just find a sparse basis for the signal and project onto that. I'll collect my $$$ now.

  45. 3 weeks ago
    Anonymous

    >No constraints on space
    Just use a 1TB dictionary. Done.

  46. 3 weeks ago
    Anonymous

    >leaderboard
    poor fricker got scammed out of potentially a billion dollar idea he can expand on

  47. 3 weeks ago
    Anonymous

    Work for free?
    No thanks.

  48. 3 weeks ago
    Anonymous

    >200x compression
    >Less than 1ms
    >Using less than 10mW
    This doesn't seem possible.

    • 3 weeks ago
      Anonymous

      they already have insane spatial resolution, they don't need the temporal resolution. Whoever made this spec is moronic. When you're actually embedded in the brain and each electrode is sampling 1-4 neurons at most you can trade-off on temporal resolution.

      What they really should just do is run a spike train generator (threshold checker) and just transmit timestamps to their receiver, maybe with a tag to indicate which of the neurons is most likely. Again, there are probably only 4 neurons at most; it's way better than the bullshit you have to perform with scalp EEG, which aggregates tons of neurons through a lossy scalp and skin medium
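
      a minimal sketch of that spike-timestamp idea (the threshold and refractory window below are made-up numbers, not anything from the actual spec):

      # Threshold-crossing detection on one channel; transmit sample indices
      # of spikes instead of the raw waveform. Lossy by design.
      import numpy as np

      def spike_events(samples, threshold, refractory=20):
          events, last = [], -refractory
          for i, v in enumerate(samples):
              if v > threshold and i - last >= refractory:
                  events.append(i)
                  last = i
          return events

      rng = np.random.default_rng(0)
      channel = rng.normal(0, 1, 20_000)     # one second at 20 kHz
      channel[::1_000] += 8                  # fake spikes
      print(len(spike_events(channel, threshold=5.0)), "events from 20000 samples")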

  49. 3 weeks ago
    Anonymous

    sent 😉

  50. 3 weeks ago
    Anonymous

    > hello sars
    > i need you to prove p=np
    > must be able to run on arduino and use 1mw of power
    > must have 1ms latency

  51. 3 weeks ago
    Anonymous

    I'm gonna solve it and hand it to them on a GPLv3 platter

  52. 3 weeks ago
    Anonymous

    >200x
    They want 143 MB compressed down to 700 KB lossless? Only possible if there are extremely large sections of repeating or blank data

    • 3 weeks ago
      Anonymous

      whoever spec'd this is moronic and doesn't understand what they're asking or information theory
      probably an unpaid intern

    • 3 weeks ago
      Anonymous

      >Only possible if there are extremely large sections of repeating or blank data
      Neurons fire repetitively for hours at consistent rates. They're essentially FM transmitters.

  53. 3 weeks ago
    Anonymous
  54. 3 weeks ago
    Anonymous

    >some rando finally does it
    >it actually works
    >neuralink starts using it in production, without checking anything
    >(why would they? They couldn't be bothered to actually write it in the first place)
    >...
    >neuralink moves onto the human trial phase
    >pic related
    >turns out the magical decompression routine was just a RNN simulating the monkey datastreams

    • 3 weeks ago
      Anonymous

      we have it coming

  55. 3 weeks ago
    Anonymous

    The military proved that animals and humans can influence random number generators. Instead of recording EEG signals, the device should be running a number of RNGs representing a low resolution map of the brain. Improbable fluctuations representing activity. This data is infinitesimally smaller. Easy.
    >but anon, isn’t that a glorified ouija board
    Yeah, but you can give it a gay marketing name, like Neurawink.

  56. 3 weeks ago
    Anonymous

    if (fileTooBig) {
    makeFileSmall(compressionLevel: .x200);
    }

    where do I redeem, bros?

  57. 3 weeks ago
    Noir

    Without being given details, we can make some assumptions about the intended application of this. From the power spec and realtime requirement, this is probably intended to run as implant firmware or similar embedded firmware. I think it's both stupid and impossible to do just in software. It wants all of: lossless, very low latency, very low power. You cannot construct a software compression algorithm that will meet all three of those. It would be far better to do it in hardware: design a high-frequency analog signal with some bandwidth reserved for parity checking and error correction. That spec is essentially 1000 channels, each reporting a 1-1000 number 20,000 times a second, so a basic 20 million samples per second of data copying. Paring that down losslessly is going to take some MAD discrete mathematics.

    • 2 weeks ago
      Anonymous

      No amount of mathematics is going to let you store 200 uncorrelated bits with 1 bit.
      To put it simply: either their data is extremely redundant or the so called "challenge" is impossible.
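
      one crude way to check which case you're in: measure the per-sample entropy of the released recordings (the file name below is hypothetical). If even the marginal distribution is near 10 bits/sample, the only hope for 200x is very strong temporal/spatial correlation:

      import wave
      import numpy as np

      # Read one recording and estimate its zeroth-order entropy. For a 200x
      # ratio on 10-bit samples the coder must average ~0.05 bits/sample; this
      # ignores correlation between samples, so it's only a first sanity check.
      with wave.open("data/electrode_000.wav", "rb") as f:
          samples = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16).astype(np.int64)

      counts = np.bincount(samples - samples.min())
      p = counts[counts > 0] / counts.sum()
      print(f"zeroth-order entropy: {-(p * np.log2(p)).sum():.2f} bits/sample")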

  58. 3 weeks ago
    Anonymous

    Do you think the elites will wear Neuralinks?
    If not, it's a trap

  59. 3 weeks ago
    Anonymous

    >lossless
    just drop this requirement
    none of your nerves are firing at 20kHz
    99.5% of the data is useless

    how to do this:
    >fast Fourier transform
    >on-board calculate out which electrodes actually need high frequency data (small neural nets can probably do this)
    >chop off the rest of the data
    >transmit
    ez
    I will take my $1000000 prize money now
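
    roughly this, in sketch form (lossy, so it wouldn't qualify under their stated rules; the block size and number of kept bins are arbitrary):

    # Per block: FFT, keep only the strongest bins, transmit (index, value)
    # pairs, reconstruct with an inverse FFT on the other side.
    import numpy as np

    def fft_chop(block, keep):
        spectrum = np.fft.rfft(block)
        idx = np.argsort(np.abs(spectrum))[-keep:]
        return idx, spectrum[idx], len(block)

    def reconstruct(idx, vals, n):
        spectrum = np.zeros(n // 2 + 1, dtype=complex)
        spectrum[idx] = vals
        return np.fft.irfft(spectrum, n)

    block = np.sin(np.linspace(0, 60 * np.pi, 2_000)) + 0.1 * np.random.randn(2_000)
    idx, vals, n = fft_chop(block, keep=10)
    print("max reconstruction error:", np.max(np.abs(block - reconstruct(idx, vals, n))))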

  60. 3 weeks ago
    Anonymous

    >2011+13
    >recordings of early brain-computer-interface neuron activity
    >zip of .wav files
    cool

  61. 3 weeks ago
    Anonymous

    who cares?
    until using it for advertising and data mining is a capital offense with no exceptions even for "consent", anybody who gets one is an abject moron
