I just don't get how digital could ever reproduce an analog signal faithfully to the original information. Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
I just don't get how digital could ever reproduce an analog signal faithfully to the original information
it cant
>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way
yes it will
You can have quantization noise in an ADC just as anyone's bajillion-dollar vacuum tube can also pick up the ham radio coomer's noise. If the sampling is not done correctly you can end up with degraded spurious-free dynamic range, but that is the fault of the technician recording the sample, not the concept.
>I just don't get how digital could ever reproduce an analog signal faithfully to the original information
You need to realize just one thing: analog equipment can't do that either. If you think otherwise, that there's no high- or low-pass frequency attenuation in your gear, no matter how much you pay for it, you're an idiot.
>it cant
have compression.
>touch grass
>Plebbit tourist cant into greentext
You have to go back
it doesn't but it does; the differences are so negligible as to not really matter. Wait till you find out most measurements aren't exactly their stated number but could be several thousandths of an inch off.
The issue is that anything that could ever receive, process or "perceive" an analog signal, regardless of whether that thing is analog or digital at the end, is somehow bandwidth-limited by its receiver. There is a limit to the amount of information per second that can be conveyed over a medium with a given bandwidth, called the Shannon limit. Want to know what Shannon also determined was the fundamental unit of information? Bits. Once you sample with a resolution in time and bit width exceeding the Shannon/Nyquist requirement for a medium, it doesn't matter anymore. You have captured that information. A lossless format that exceeds the Shannon limit of human hearing is all you need. The moronic kbps levels some "audiophile" smoothbrains are pushing now lapped that long ago.
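The Shannon limit mentioned above is concrete enough to compute. A minimal sketch in Python; the 20kHz bandwidth and 96dB SNR inputs are illustrative assumptions, roughly matching human hearing and 16-bit audio:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity in bits/second: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A channel roughly matching one ear: 20 kHz bandwidth, ~96 dB SNR
capacity = shannon_capacity(20_000, 96)
print(f"{capacity / 1000:.0f} kbit/s")  # ~638 kbit/s
```

For comparison, a CD carries 705.6 kbit/s per channel, which is one way of reading the claim that lossless CD-rate audio already exceeds what the ear can take in.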
holy frick, I don't know who Shannon is but this post is based as frick, audiogays on suicide watch
Reee anyone that doesn't use ear buds and spotify is a audiogay.
>just listen to trash like me
Yeah no thanks
>muh golden ears
Giga copium.
>I don't know who Shannon is
2022 IQfy everybody
IQfy is a handful of zoomers arguing about what operating system has the best mascot.
None of them know what a function is.
None of them know any maths.
All they know is how to consume.
this basically. 44.1kHz sampling covers the entire human hearing spectrum while 16bit depth gives 96dB of dynamic range (65k possible values of the sound).
If there is a limiting factor to quality of digital audio systems, it's the output stage, not the digital data containing the audio itself.
>t. electrical engineer
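The numbers in the post above are easy to check: for ideal uniform quantization, the dynamic range in dB is 20·log10 of the number of levels. A quick sketch:

```python
import math

bits = 16
levels = 2 ** bits               # 65536 distinct sample values
dr_db = 20 * math.log10(levels)  # ratio of full scale to one quantization step
print(levels, round(dr_db, 1))   # 65536, 96.3
```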
>this basically. 44.1kHz sampling covers the entire human hearing spectrum while 16bit depth gives 96dB of dynamic range (65k possible values of the sound).
A digital recording has to be band-limited to work. This means anything outside of the limiting values is discarded. There is no digital recording that sounds as good as a live performance because live (analog) music is not band limited. That's why a band using speakers to amplify the sound of their instruments could not just be replaced with a high-bit FLAC and sound the same. You reality deniers are really tedious and annoying.
I can't tell if this is a troll or if people actually believe they can tell the difference between audio going over a wire.
A proper reel-to-reel master will always sound better than a digital approximation because of digital's inherent loss of information.
No - it will contain more information, it won't perceivably sound better. These are different things.
Not if you're playing it back to switching chink-fi gear, no.
no, that has more to do with the coloration recording to tape puts on a sound. i remember butch vig doing a comparison on gearbawdz a long time ago.
>shannon-nyquist
correct, 10 points to homosexualdor
>ham radio coomer's noise
imagine
>t. zoomie on gaypods
>gucci gang gucci gang gucci ga-*BZZZZ*CQ CQ CQ W6COMR W6COMR*CHK*cci gang gucci gang
dude just make the delta go towards zero. God u engineers are so stupid.
infinitely small values don't exist in real life
>t. planck length
Number of bits merely determines the noise floor, the signal itself is always perfect.
>no matter how many bits you throw at it, will always alter the original waveform/information in some way.
technically true, but once you've thrown "enough" bits at it, it becomes "good enough" for literally anything.
he's talking about bit rate, not bit depth.
It's not merely "good enough": 24 bits is higher precision than physically possible. Everything after that is limited by the same factors that would limit a purely analog chain as well.
/thread
Did you know that square wave + low pass filter creates a perfect sine wave? Fourier is magic, man.
Yes here we go again. A ton of people with nothing but youtube video experience explaining how Digital somehow perfectly duplicates a direct copy of audio.
Nah, applied math degree here and grad degrees in computer engineering. Enjoy your gold HDMI cables and $10k headphone set up, though!
Holy shit!! Computer engineering AND applied math? Fricking-A. Gold does make a good coating though. Ever look inside an older Tektronix oscilloscope at the contact fingers for volts/div and such? All gold. Weird.
>Nah, applied math degree here
So you admit to being an absolute moron who chose the pseudo-intellectual equivalent of "women's studies".
>and grad degrees in computer engineering.
So you weren't smart enough to figure anything out on your own like most people who actually matter in this field did, but you got some paper. Right.
>Enjoy your gold HDMI cables and $10k headphone set up, though!
Claims to be "smart" because of credentials he likely took on copious amounts of non-dischargeable debt to pay for... but then actually makes a statement like this, where he thinks there is no difference in quality among HDMI cables... especially when the length exceeds 10 feet. Wow. These morons still exist in 2022.
why are you so mad though lol
As if humans would be able to tell the difference anyways between analog and digital
You can tell the difference, analog has a hiss.
the hiss is because analog audio is cursed as soon as it's plugged in to a normal, ground-looped source.
wireless/optical-only battery-powered audio that does not share power with a CPU switching signals billions of times a second won't have hiss, or so little of it that you can't hear it unless the environmental noise floor drops below 0dB or something like that.
>As if humans would be able to tell the difference anyways between analog and digital
People can, and this thread is full of people claiming there is no difference who have no HiFi gear or whose idea of HiFi is playing a FLAC instead of an MP3 (then telling us they can barely hear the difference) using their motherboard's onboard sound.
>You can tell the difference, analog has a hiss.
There's always baseline noise. The reason cheapo digital audio is perceived to sound better than analog is that even consumer-grade equipment has a better SNR than mid to high end analog gear... but the difference in sound is apparent to objective individuals.
It turns out humans aren't that good at noticing the differences
Especially when there's no difference.
Of course there is a difference. You don't find perfect sine waves in nature. That's why (an ideal) analog recording will always be better than a digital one.
There's literally no difference.
>You don't find perfect sine waves in nature. That's why (an ideal) analog recording will always be better than a digital.
First, non-sequitur, second any signal is a sum of perfect sine waves.
>any signal can be approximated by a sum of perfect sine waves
</nitpick>
It's not nitpick, it's not even an opinion, it's literally just false.
you'd need an infinite number of sine waves to represent an arbitrary signal
Reality check: reality doesn't allow arbitrary signals.
Incorrect. Noise is not information.
>Incorrect. Noise is not information.
Noise is information; it's just not the desired information, and it's external to the original signal.
Analog has noise external to the original signal. A frickton of such noise in fact.
humans can't see past 24 fps btw
I can’t see past my own gut. I hope my dick is doing okay.
Nothing can replicate a signal perfectly.
With analog, you have unavoidable thermal noise.
With digital, you have unavoidable quantization noise.
Of the two, digital is the easier one to work with and improve, but analog can't be disregarded as it is necessary for all physical I/O.
TL DR: audiopedophiles get the rope.
it would be noise outside the process anyway. Digitizing the music actually improves the quality.
No it doesn't
For every sample where the quantization moves you in a direction to cancel any noise there is equal probability of moving in a direction which increases the magnitude of the noise
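That's the standard result for quantization error: with a signal that exercises many levels, the rounding error is unbiased (zero mean) and bounded by half a step, i.e. it behaves like added noise rather than systematic cancellation or reinforcement. A sketch, assuming a 16-bit step over a [-1, 1) range:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.uniform(-1, 1, 100_000)   # stand-in for an analog signal
q = 2 / 2 ** 16                        # one quantization step at 16 bits
quantized = np.round(signal / q) * q   # round each sample to the nearest level
error = quantized - signal

print(abs(error.mean()))               # near zero: no bias in either direction
print(np.max(np.abs(error)) <= q / 2)  # True: bounded by half a step
```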
Analog can't even produce analog faithfully. Recording implements degrade over time and the output varies in quality by whatever analog circuits sit between you and the data.
It's why computers can reproduce a digital signal exactly but there are such things as "good speakers" and "bad speakers".
>Analog can't even produce analog faithfully. Recording implements degrade over time and the output varies in quality by whatever analog circuits sit between you and the data.
I have an old reel-to-reel that is over 40 years old and it still works... and the music recorded onto reels of similar age still work fine. Keep introducing irrelevance to distract from the fundamental fact that a digital recording of an analog signal will always be an approximation, while an analog recording using quality equipment will capture detail that a digital recording cannot.
>still works
is not the same as
>reproduces the album in exactly the same way it did 40 years ago.
Whereas I can get a digital video from 20 years ago and it's exactly the same as it was then.
>Whereas I can get a digital video from 20 years ago and it's exactly the same as it was then.
False again. Analog master recordings don't degrade over time as long as they are kept in proper environmental conditions. Where digital has an advantage over analog is in duplication. It's easy to make 1:1 digital copies, whereas an analog copy degrades with each generation.
>proper environmental conditions
lol shut the frick up, billions of dollars are spent restoring old film, especially masters. you're just being a fricking hipster, essentially arguing it doesn't degrade if you don't use it... which is the dumbest fricking argument I've heard today, because not only is it inarguably false, it defeats the whole purpose of any claim of superiority.
Face it, analog devices and recordings fail over time. Tubes lose their warmth and they're so imperfectly made that you'll not get the same output from one play to the next, let alone years down the road.
Sampling at 40kHz with infinite precision = 20kHz and below signals reconstructed PERFECTLY.
Sampling with finite precision + dithering = perfectly reconstructed signal + some fixed amount of noise. That fixed amount of noise is lower than anything analog can possibly achieve.
It's all mathematically proven.
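The "mathematically proven" part is the Whittaker-Shannon interpolation formula: a band-limited signal is rebuilt exactly from its samples as a sum of shifted sincs. A small numerical sketch (a finite sample count, so there's a tiny truncation error the infinite sum wouldn't have):

```python
import numpy as np

fs = 100.0                                # sample rate in Hz
f = 13.7                                  # tone frequency, well below fs/2
n = np.arange(200)                        # 2 seconds of sample indices
samples = np.sin(2 * np.pi * f * n / fs)  # the discrete samples

def reconstruct(t):
    """Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc(fs*t - n)."""
    return np.sum(samples * np.sinc(fs * t - n))

t_test = 1.003                            # between sample instants
err = abs(reconstruct(t_test) - np.sin(2 * np.pi * f * t_test))
print(err)                                # tiny, limited only by truncating the sum
```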
yes but can you tell the difference? Probably not.
>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
Yes. You can't get the same signal out as you've put in, but it can be pretty damn close that your little monkey ears can't hear the difference, or, more likely, that your audio equipment can't play back such subtle differences.
The analog signal is already not a perfect representation of the original sound. A membrane needs to record the signal.
This is why tube amps will always sound better. They're analog.
All amplifiers are analog
Holy shit you guys really know nothing
not ALL amplifiers, class D is a switching amplifier
it's the exception, though, not the rule, they're used where maximum efficiency is needed, since they're the most efficient option
It's still an analog circuit
yea, you're right
switching != digital either
You just use math
>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
Yes, and?
>all these morons who don't realize that analog gets affected by noise too
my gold plated triple shielded cable with jesus cum sheathing blocks all the noise, actually
Analog has MORE noise.
For any analog signal, you can't. But with audio, for example, you can reasonably say humans won't hear frequencies above a given threshold (generally said to be around 20kHz), and if you sample at 2x that rate you can reproduce it perfectly, thanks to the Nyquist–Shannon sampling theorem, which was actually proven by Shannon (this is why you see audio files almost always sampled above 40kHz, or even much more, up to 100s of kHz). If you know information theory and Fourier transforms well enough to understand the proof, you can just read it, but it's a lot of math. I cannot yet understand it myself, to be honest.

And by the way, it's not like analog media are perfect either: they drift further from the original through thermal expansion, loss of weight, degradation of material... This will cause the higher frequencies (on a vinyl record) to essentially degrade into random noise over time. There is also a physical limit imposed on you by the size of the needle and so on. On the other hand, if you sample your signals digitally, you can use all the well-known error correction schemes. The sad reality is that no way of recording information is truly perfect; you always have a limit on how much information you can record, digital or analog.
The proof is actually fairly simple when the idea clicks.
Comes down to the fact that if you sample a function with interval d, then its Fourier transform becomes periodic with interval 1/d. If the support of the original F-transform was small, then you can reconstruct it from the periodic version by stripping away all periods but one.
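Written out, the sketch above (with T the sample period and f_s = 1/T):

```latex
% Sampling x(t) at t = nT periodizes its spectrum with period f_s = 1/T:
X_s(f) \;=\; \frac{1}{T}\sum_{k=-\infty}^{\infty} X\!\left(f - k f_s\right)
% If X(f) = 0 for |f| \ge f_s/2, the shifted copies never overlap, so an
% ideal low-pass filter keeps exactly one copy of X(f). In the time domain
% that filter is sinc interpolation:
x(t) \;=\; \sum_{n=-\infty}^{\infty} x(nT)\,\operatorname{sinc}\!\left(\frac{t - nT}{T}\right)
```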
>if you sample at 2x that rate, you can reproduce it perfectly, tanks to the Nyquist–Shannon sampling theorem, which was actually proven by Shannon
In engineering we usually use the Nyquist-Shannon limit to set the upper bound on information, not to assert complete information beneath it. The exactitude you're talking about would require perfect periodicity, or a filter with infinite support. It's an unrealizable theoretical construct.
Not to support audiophools because nobody can actually hear the difference with audio that's gone through an ADC, because the loss tends to be well below the noise floor of the original recording.
>Not to support audiophools because nobody can actually hear the difference with audio that's gone through an ADC, because the loss tends to be well below the noise floor of the original recording.
Without comparing the original source to the recorded version, there is no way to know if a difference can be heard. You're making flimsy and idiotic presumptions that the noise floor is all that matters, or that when comparing digital to analog recordings you would not consider equal noise floors for both methods.
Digital recordings are band-limited, which means that right from the outset you are excluding all frequencies beyond the limits you choose to apply. Sound in the "real world" is not band-limited, and it is entirely asinine to pretend that an analog recording of a signal will not be more complete than a digital recording of the same.
There are many analogphiles who also enjoy digitally mastered analog media. So perhaps they take issue not with the ADC side but the DAC side.
Can't roll off the frequency response within a factor of ten of the audible range without introducing an audible phase modulation.
That is why we sample at 192000Hz and call it a day regardless. Vinyl and other analog media cannot accurately reproduce high frequencies either way, while digitally we can always just sample higher.
Reel-to-reel can.
human hearing is band limited
all real-world analog audio systems are band limited
nothing has infinite bandwidth
Digital has finite resolution. Analog has infinite resolution.
Analog will always sound better than digital.
Analog has less resolution than digital. Analog is equivalent to 6-bit digital audio.
Incorrect. Digital throws away information and attempts to reconstruct it, analog contains all information.
digital does not attempt to reconstruct anything it throws away
>but muh discrete time signal throws away anything between samples!
i realise it's intuitive, but it doesn't. there's no need for infinite samples because we aren't trying to capture an infinite frequency range. we have proven the correlation between sample rate and captured frequency range, and have mathematical proof that anything under half the sample rate is captured perfectly
anything that would cause the reconstruction to differ from what is captured by the sampler MUST, literally MUST, fall beyond half the frequency of the samples, and we simply sample at over twice the frequency limit of human hearing, 20kHz. that way everything within hearing limits is captured, perfectly.
you can argue that real-world dacs and filters aren't perfect, sure, but neither is your analog gear, and increasing the sample rate won't fix anything, because that part is already as good as it will ever need to be, outside of bionic ears improving human hearing range
This is quite clearly not true either. Play an 8-bit audio file and tell me you think it sounds like a vinyl record
It sounds better than vinyl.
https://www.audiocheck.net/blindtests_16vs8bit.php
I can easily pass it but that's why I don't use analog bullshit.
You're literally comparing two recordings of an 8-bit synthesiser. Why do you think it has that particular sound character?
No excuses
https://www.audiocheck.net/blindtests_16vs8bit_NeilYoung.php
No it doesn't. Google: SNR (Signal to noise ratio)
fourier transformation
consider the following
the noise floor for the human ear is around -90dB
at that level of sensitivity, the sound of gas molecules bumping into your ear hairs due to Brownian motion is what limits you from hearing any quieter sounds.
a 16bit audio file, if theoretically played through a perfect speaker system, has a resolution whose minimum difference between samples is around 1/10th of the noise of the Brownian motion of hydrogen.
in the end, the biggest source of distortion, by orders of magnitude, is in your speakers, not your amp or the audio file. And a 16bit audio file, if encoded well, should contain more resolution than you can physically hear, by some simple physics principles.
Do you have a source for this? Interesting if true
>I just don't get how digital could ever reproduce an analog signal faithfully to the original information. Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
Hey, thanks for bringing this up so I didn't have to continue an archived thread. Your assessment is correct, but there is an underlying sect of butthurt pseudo-intellectuals who really need to believe that digital = perfect, always.
See my posts from:
Explanation for brainlets:
The alleged EE response:
This tool, as is generally the case with most engineers, thinks that fixation on limited metrics or measurements of specific elements equates to an understanding of the whole... i.e. you try to determine the purpose of a car by inspecting the headlights and mirrors, and declare that because it has headlights and mirrors it must be some kind of portable lighting machine.
A guy who doesn't understand "nyquist sampling" claims I don't:
And this is the main "argument" that they're fumbling to assert... the Nyquist theorem, which essentially states that sampling an analog (continuous-time) signal at a rate twice the highest frequency of the signal will yield sufficient data to reproduce the original signal.
The key factor here is in dealing with BAND-LIMITED signals, meaning that the bandwidth is limited to whatever the person decides should be the endpoints. In this case, if you limit the bandwidth to 20kHz from zero, then sample at a rate of at least 40kHz, the sampled data should be enough to reproduce the information contained between 0 and 20kHz.
Cont ...
Continuing:
The problem with claiming that Nyquist sampling is a "perfect digital representation" is that in reality, sound is not band-limited... and "CD quality" was based upon the theory that people cannot typically hear sounds above 20kHz... yet in reality, the sounds emitted in the "real world" by actual instruments have harmonics that exceed 20kHz, and while we may not be able to hear them directly, we can hear them when they reflect (aka echo) as well as reverb. This is one of the reasons an analog recording played on an analog amp will have a certain indescribable warmth and character that is lacking on the digital recording.
>more treble adds warmth
Full moron moment.
Source? I've not heard of reflections changing the frequency of sound waves before. I'm not denying it's true but this is quite an unusual claim and certainly not what most textbooks on the physics of waves assert
> we can hear them when they reflect (aka echo) as well as reverb
this makes no sense. If we can hear them at a live performance via echo then they must be distorted to be of frequency <20kHz and therefore will be captured by a recording device sampling at 44.1kHz. Why can't we hear it on digital then?
He's also saying adding treble adds warmth, don't assume his posts are logical.
you are a Dunning-Kruger moron.
by your own analog/digital argument, which is a brainlet not understanding his own biology and the physics and maths behind the world, a vacuum tube amp and a class A/B silicon amp would be the same. You ARE paying for distortion: a vacuum tube amplifier introduces distortion to the sound that people think makes it ""warm"". This is very simple to model and simulate, because funnily enough you can look at a datasheet for the frequency response of a pentode tube and compare it to a stage of silicon. I understand that I will not convince you of anything because of your Dunning-Kruger brain, but hopefully other people will heed this reply and completely ignore all of your useless opinions.
>a vacuum tube amp and a class A / B silicon amp would be the same
How could that be possible, when a Class AB amp relies on switching to produce its output while a tube provides a constant amplification?
You're just another moron who is stuck in denial, and whose pseudo-intellectualism has made him dumber.
a class A amp and a class B amp are both linear amplifiers. Vacuum tube amps are class A or class B typically. You don't know how anything works
>a class A amp and a class B amp are both linear amplifiers. Vacuum tube amps are class A or class B typically. You don't know how anything works
A MOSFET amp (digital) is fundamentally different from a tube amp in the manner they amplify a given signal. A digital amp uses electronic switching to control the current; tubes do not switch; they provide constant amplification to the signal at any given frequency for which they are mechanically able to. If you weren't clueless, you'd probably not have said anything.
You fundamentally misunderstand how transistors work.
>mosfet = digital
you should probably learn a bit more about the alternatives before you claim tubes are better
http://www.youtube.com/watch?v=0e_OUyGCaBs
with a bit of research you too can benefit from 60+ year old technological advancements
>How could that be possible, when a Class AB amp relies on switching to produce its output while a tube provides a constant amplification?
Literally not how it works, moron.
Do not you realise you are moronic? Do you not realise what the frequency domain is? Do you not realise that ALL signals are subject to noise? Do you not realise that quantization noise is often much lower than the analogue equivalent for the same task?
>Do not you realise you are moronic?
You seem to be.
>Do you not realise that quantization noise is often much lower than the analogue equivalent for the same task?
The fool has arrived, and he's missing the point...that digital recordings are always going to be approximations of the analog source, if they are in fact sampling an analog signal as the source.
You do realise that an amplifier cannot perfectly recreate a signal but with a greater amplitude? You do realise that it will always just be an approximation through the transfer function of the amp? You do realise all amps have filters that are designed to change, distort and cut off the original signal?
Sampling above Nyquist will always improve the SNR. Nyquist is the midwits dream.
>Sampling above Nyquist will always improve the SNR.
24 bit 44.1kHz has higher SNR than thermodynamically possible in our world, the frick else do you want.
i think all he's saying is that if humans can see 24fps, then you want something like 60fps for your videos just to get yourself comfortably past it. I think 16-bit audio is probably sufficient, 24-bit audio is definitely sufficient. The bigger enemy is people with no understanding of how these things work sharing their stupid opinions
I'm not talking about that you invertebrate gaymer. You ARE one of those people. SNR is a metric that can be calculated and is not some subjective experience. It does not only apply to audio, It can apply to any signal from the movements of the earth below our feet to millimeter radar and above.
This isn't really a thing. The only metric that matters is SNR. An analogue signal isn't at different discrete levels but is always continuous. Get a scope on any signal and you will very quickly find this is so.
SNR of analog audio is equivalent to 6 bit digital audio. If SNR is the metric you want to use then analog is the last thing you want. Actually the only metric you could attempt to use to prove analog is better than digital is bandwidth.
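The "N bits" equivalences being thrown around here come from inverting the ideal quantizer SNR formula (SNR ≈ 6.02·N + 1.76 dB for a full-scale sine). A sketch; the analog SNR figures fed in are the thread's claims, not measurements:

```python
def snr_to_bits(snr_db: float) -> float:
    """Effective number of bits for a given SNR: ENOB = (SNR - 1.76) / 6.02."""
    return (snr_db - 1.76) / 6.02

# Claimed SNR figures (assumptions for illustration):
print(round(snr_to_bits(40), 1))   # ~6.4  bits: consumer-grade analog gear
print(round(snr_to_bits(80), 1))   # ~13.0 bits: professional reel-to-reel
print(round(snr_to_bits(96), 1))   # ~15.7 bits: what 16-bit digital delivers
```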
moron spews random bullshit, 2022 edition, episode 69420
How can you be this clueless.
>87251414
see
Bit depth=SNR. You plugging your ears and screaming "la la la I can't hear you" doesn't change that.
>Bit depth=SNR.
No it is not, you moron. Signal-to-noise ratio depends on ALL noise sources in the signal chain. If you have a very crappy amplifier but a high bit depth ADC, your SNR is dominated by the amplifier, not by the bit depth of the ADC. Actually, bit depth is not even the sole source of noise in an ADC; for example, sampling jitter and phase noise can be limiting factors.
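The point about the crappy amplifier dominating is just how uncorrelated noise sources combine: they add in power, so the total noise floor sits at the level of the worst stage. A sketch with made-up dB figures:

```python
import math

def combined_noise_db(*levels_db):
    """Uncorrelated noise adds in power: total = 10*log10(sum of 10^(L/10))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Hypothetical chain: -120 dB ADC quantization noise, -70 dB amplifier hiss
total = combined_noise_db(-120, -70)
print(round(total, 2))  # -70.0: the amp's hiss completely dominates
```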
We are not discussing the entire chain, we're discussing everything before the amplifier, which is either an analog medium or a file + DAC. In this context bit depth is exactly SNR (of course there are bad DACs that can't actually achieve that SNR). If we were to discuss the entire chain, of course SNR would be affected by the amp; you could plug your DAC into a tube amp.
Which noise sources are you counting for the analog signal?
I just look at the multitone test, it shows every noise and distortion source combined.
>87251463
see
Bandwidth and SNR are intrinsically linked.
6 bit at 1hz is better than "analogue"? I don't think so. Is DSD better than your grandmothers vinyl collection? I think so!
Please don't get me confused for an analogue touting audiophile just because I disagree with you.
Explain DSD. Sampling above Nyquist improves signal quality.
DSD is a moronic format that has dithering baked in. The only difference in SNR here comes from the fact dithering extends further into ultrasonic range so you can have less dither in audible range. Your DAC converting PCM to 1 bit with the same kind of dither gives the exact same result without having to store all that noise in a file.
>6 bit at 1hz is better than "analogue"?
The professional reel to reel tape tops out at 13 bit. Consumer analog gear is right about 6.
>The professional reel to reel tape tops out at 13 bit. Consumer analog gear is right about 6.
You cannot ascribe "bits" to an analog recording because analog signals are continuous-time, and effectively have "infinite" information between any two points on the sine wave. These apples-to-oranges comparisons of two fundamentally different mediums show a profound ignorance.
>SNR of analog audio is equivalent to 6 bit digital audio.
This is a myth and a lie promoted by some homosexual on YT. When CDs first became available in the 80s and even into the 2000s, studio master recordings were reel-to-reel tape.
In the imaginary land of "analog audio is like 6-bit digital audio" that would mean all CDs would have sounded like shitty mix tapes.
>The only metric that matters is SNR.
Just shut up, fool. Seriously. Nobody with an IQ above 40 is going to claim that a single metric rules them all...only a buffoon out of dumb arguments would try to introduce an intellectually bankrupt idea such as this.
1 bit audio
You are delusional. Seek medical assistance.
>Imagines SNR is the only relevant factor. Keeps repeating himself.
BAKA
SNR and bandwidth are literally all there is to audio. You can't name anything else.
Format capabilities have nothing to do with what is actually put on them.
>You cannot ascribe "bits" to an analog recording because analog signals are continuous time
You can. Because bit describe SNR, not information "between points".
>and effectively have "infinite" information between any two points on the sine wave
This information is finite. Just as you can perfectly reconstruct an infinite line from two points, you can reconstruct an analog signal of bandwidth N from 2N samples per second.
>You do realise that an amplifier cannot perfectly recreate a signal but with a greater amplitude?
A proper Class A tube amp can.
No amplifier is noise free or perfectly linear.
>No amplifier is noise free or perfectly linear.
Irrelevant. An analog amp is not mechanically band-limited as to what it can output, while a digital amp must be in order for it to work.
>You do realise that an amplifier cannot perfectly recreate a signal but with a greater amplitude? You do realise that it will always just be an approximation through the transfer function of the amp? You do realise all amps have filters that are designed to change, distort and cut off the original signal?
Nobody said otherwise, but the same holds true for digital...so a digital recording which is an approximation is then played back on an amp which does its best to approximate the original signal... You'd have to be blind to fail to see how much potential information is simply not present.
An analog signal played back on an analog (tube) amp may still be missing information, but nowhere near to the extent of the digital counterpart, because the digital recording set a limit on the signal's bandwidth from the outset.
Since we're on the subject, why is it that signal processing is rarely taught in CS degrees? Most IT guys I've talked to have no idea that you could solve problems in the frequency domain instead of the time domain (or that such a thing exists at all). It seems like such a useful thing to know.
CS degrees are designed to pump out general-purpose programmers for the industry.
People who want to make algorithms go to the math department instead.
>People who want to make algorithms go to the math department instead.
Oh, is that how they sold you that bill of goods. Now you're just a barista who doesn't need to use the cash register.
bump
did someone post this yet?
itt: teenager learning about signals finds out that you can't take infinite samples
You can have the exact same argument against analog signals, because of noise
Analog is never perfect. That's why you have closet pedophiles spending their life insurance money on equipment made of fairy dust.
Digital is the exact same whether it's 1982 or 2022. The files on a CD will be exactly the same on all equipment, via TOSLINK or otherwise; it's the DAC that changes the sound.
I thought silver was a better conductor than gold
Gold is more ductile and thus easier/more materially efficient to spread thin enough to use as shielding
t. my ass
No. Gold resists tarnishing. Silver does not.
I contend that golden ears is not an innate talent, but is learned and trained over time like any sport activity.
Per Siltech the optimal combination is a base of pure silver alloyed with a bit of gold to fill in the micro cracks in the silver crystal structure which cause distortion. I have not been able to verify these claims due to the difficulty in building wire from scratch but defer to their stellar reputation.
A signal is just an abstract mathematical concept: some scalar varying over time. If the source of what that scalar describes is something in the real world, it has to obey real-world rules, like being speed-capped below the speed of light.
>what is Nyquist rate
A tube amplifier is inherently analog and does not rely on switching of any kind, that's why they sound better. They have infinite resolution.
Simple as that.
Solid state amps have no switching either, moron. Only class D has switching, which is an entirely different beast from regular solid state amps. And they beat tube amps anyway lol.
>They have infinite resolution.
You seem to believe in magic.
>A tube amplifier is inherently analog and does not rely on switching of any kind
Neither do non-class D transistor amplifiers
>They have infinite resolution.
Resolution is a concept that only applies to the digital domain, not the analog one.
A low resolution does result in a high noise floor, but analog transistor/tube amplifiers also have a noise floor. If the resolution of your digital amplifier is high enough, the noise floor is lower than the noise floor of the analog circuitry and hence not distinguishable/performance limiting.
Also important to consider is not just the noise floor, but distortion. And here tube amplifiers absolutely get blown away by transistors, whether digital or not.
>>A tube amplifier is inherently analog and does not rely on switching of any kind
>Neither do non-class D transistor amplifiers
Not sure if trolling or if you're just a megamoron. A transistor is literally a switch, and all digital amps work by switching. Transistors were the original replacements for tubes before MOSFETs became a thing.
>Resolution is a concept that only applies to the digital domain, not the analog one.
False again. Resolution is the measurable level of detail and/or precision a particular medium can achieve. Our eyes, being fully analog, have optical resolution. As do camera lenses. This simple fact is why so many of you morons were duped into believing earth is a ball - you do not understand the basics of how things actually work.
Analog resolution is effectively and substantially higher than digital; a 6-bit per sample digital recording will not contain nearly as much original information as the analog equivalent recording of the same source.
>transistors are switches
hi again, IQfy
>A transistor is literally as switch, and all digital amps work by switching.
Wrong.
>A tube amplifier is inherently analog
you think solid-state amps aren't analog? huh?
Here's your tube amp, bro.
Here's your infinite resolution bro.
Ah, the measurebating chinkfi shill has arrived.
>you can't measure resolution because...YOU JUST CAN'T, OKAY?
>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
You are completely correct. It does. You lose high frequency information, and thus it is key that you sample higher than the highest frequency you are interested in.
For audio, for example, you want to capture everything up to 20-ish kHz. Anything above that we cannot hear anyway, so why retain that useless information at the cost of higher power consumption (faster sampling costs more power) and more memory to store it?
For more info read up on the Nyquist–Shannon sampling theorem.
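A quick sketch (my own numbers, nothing from the thread) of what goes wrong when you *don't* sample above twice the highest frequency: a 30 kHz tone sampled at 44.1 kHz doesn't disappear, it aliases down to 14.1 kHz and corrupts the audible band. That's why you filter out everything above Nyquist before sampling.

```python
import numpy as np

fs = 44100                      # CD sample rate; Nyquist = 22050 Hz
n = np.arange(fs)               # one second of samples
f_tone = 30000.0                # ABOVE Nyquist -- illegal input

x = np.sin(2 * np.pi * f_tone * n / fs)

# With N = fs samples, the rfft bin spacing is exactly 1 Hz,
# so the peak bin index IS the frequency in Hz.
spec = np.abs(np.fft.rfft(x))
peak_hz = int(np.argmax(spec))  # lands at fs - f_tone = 14100 Hz, not 30000
```

The sampled data is indistinguishable from a real 14.1 kHz tone — the information above Nyquist wasn't lost, it was folded back on top of the audible range, which is worse.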
Class AB and no tubes??? What wizardry is that???
this guy explains it all https://www.youtube.com/watch?v=cIQ9IXSUzuM
ah crap that's what I meant to post
Via metrology resolution can be superimposed over any media setting a modulation threshold
I have tinnitus
I always hear noise
fun fact: a speaker membrane takes time to move from point A to point B, and in that time you've introduced enough noise to make quantization entirely irrelevant. The only time it matters is in a theoretical ideal circuit where the output device takes zero time to move between states
which will never exist
You know analog signals are many pure sine waves piled together. If you can capture those accurately with twice the sampling rate of the target frequency, you are fine. In fact, all the small peaks and other detail occurring in between the samples are content at frequencies higher than 20000 Hz. Take the Fourier transform of a 20000 Hz sine wave: if you sprinkle weird peaks and other details onto the waveform, you are adding content above 20000 Hz. A square-ish shape riding on a 20000 Hz sine wave (not on a lower frequency, of course) is made of frequencies higher than 20000 Hz. So the information between the samples that we can't capture wasn't actually audible anyway. A hard filter above 20000 Hz would remove those artifacts regardless.
What happens with all those artifacts in between the samples troubled me at first. I was thinking, well, you are not capturing those, your resulting output will look slightly different. But then I understood that those artifacts, if they happen within the period of a 20000 Hz sine wave, ARE HIGHER FREQUENCY, and the sampling simply does not have to care about them.
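You can check this numerically. A hedged sketch (my own construction): take a 20 kHz sine and make it "square-ish" by adding its 3rd harmonic — that extra detail lives entirely at 60 kHz. Brick-wall filter everything above 21 kHz and you get the pure 20 kHz sine back, confirming the between-sample wiggles were purely ultrasonic content.

```python
import numpy as np

fs = 192000                         # high sample rate so 60 kHz is representable
n = np.arange(fs)                   # one second
f0 = 20000.0

pure = np.sin(2 * np.pi * f0 * n / fs)
# "square-ish" version: add the 3rd harmonic (Fourier series of a square
# wave starts 1, 1/3, 1/5, ...), which sits at 60 kHz
squarish = pure + (1 / 3) * np.sin(2 * np.pi * 3 * f0 * n / fs)

# brick-wall low-pass at 21 kHz: zero every frequency bin above it
spec = np.fft.rfft(squarish)
freqs = np.fft.rfftfreq(len(n), d=1 / fs)
spec[freqs > 21000] = 0.0
filtered = np.fft.irfft(spec, len(n))

residual = np.max(np.abs(filtered - pure))  # numerically zero
```

The "artifacts" that made the wave look square vanish with the filter — they never contained any sub-20 kHz information to begin with.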
analog can't reproduce the original information perfectly either
there's just too much interference to be perfect
and even that assumes the recording was perfect to begin with, which is also not true
once a sound wave is emitted, that exact wave is lost
144 Hz monitors are superior despite the fact that eye sample rates are ~60 Hz.
If you've ever seen THE AVIATOR you'll recognize audiophiles are a breed of obsessive compulsives who cannot regulate themselves. They really think they have ears made out of God's fabric and anything they listen to MUST be perfect or they are UNCLEAN.
this, lossless gays are just obsessive compulsives, lossless is only useful for archiving, and archivists are also obsessive compulsives
having good hearing seems like more trouble than it's worth, seems to lead to OCD
something like an LP can't even produce the same signal two times in a row because the needle will cause tiny abrasions that will alter the recording as it passes over it
the ideal way to use an LP is to make a digital recording of the first playback; that way you can enjoy it at its peak quality as many times as you like
My N just study Nyquist-Shannon's theorem and stop being a b***h.
it can, up to the Nyquist frequency, and down to the fixed noise floor determined by the bit depth
quantisation only affects the noise floor, and it's predictable; you can just set the noise floor where you want it
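You can measure exactly that. A minimal sketch (my own toy quantizer, not any real ADC): quantize a full-scale sine at different bit depths and compare the measured SNR against the textbook rule of thumb, SNR ≈ 6.02·N + 1.76 dB. The 997 Hz test-tone frequency is an arbitrary choice that avoids lining up with the sample grid.

```python
import numpy as np

def quantize(x, bits):
    # uniform mid-tread quantizer over [-1, 1]
    levels = 2 ** (bits - 1)
    return np.round(x * levels) / levels

fs = 48000
n = np.arange(fs)                       # one second
x = np.sin(2 * np.pi * 997 * n / fs)    # full-scale 997 Hz test tone

for bits in (8, 16):
    noise = quantize(x, bits) - x       # quantization error signal
    snr_db = 10 * np.log10(np.mean(x**2) / np.mean(noise**2))
    theory = 6.02 * bits + 1.76         # ~49.9 dB at 8 bits, ~98.1 dB at 16
```

Throwing more bits at it just pushes the noise floor down by ~6 dB per bit — it doesn't "blur" the waveform above that floor.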
You know nothing of inversion
OP discovers Nyquist theorem
>Seems like the quantization process, no matter how many bits you throw at it, will always alter the original waveform/information in some way.
It does. It just that it eventually gets "good enough".
Plus, once the digital signal is reproduced, it's also affected by several physical effects, some on purpose, some not, that can smooth out and/or distort the signal that reaches your brain. Because of that, not even a pure analog recording can be reproduced 100% accurately.
>I just don't get how digital could ever reproduce an analog signal faithfully to the original information.
It doesn't have to, it just has to reproduce it with a sufficient level of precision that a human couldn't possibly tell the difference, e.g. redbook audio standard (16-bit depth, 44,100Hz sampling rate, which no human can ABX from higher res files).
Wait until he learns that a switching-mode power supply can have less noise than a transformer-based one, now that's a doozy.
not op but really?
it's amazing how much of this subject is full of intuitive truths
Earlier SMPS were shit but all state of the art power supplies are switching, not just because of their efficiency. You can do much better filtering with them.
Well, shit. Barely any of this was explained in my signals and systems/digital and hybrid control courses. Starting to make sense. I need to brush up on this shit.
>Barely any of this was explained in my signals and systems/digital and hybrid control courses
You are aware you were supposed to study for your classes outside the lecture hall too, right?
True, but loss of signal to noise can be much worse in an all-analog case if the circuit wasn't made to a dreadfully high (and expensive to produce) spec. This was the reason digital computers became favored after the era of analog computers.