Have you ever wondered why vinyl is enjoying such a strong revival? I remember being introduced to serious music reproduction over 35 years ago by a friend who owned a high-end Japanese system, and being so impressed that I embarked on a journey of building and owning my own audio system based around a highly specified Linn Sondek LP12. At that time, Linn® pioneered the "source first" philosophy, with other high-end marques agreeing and developing products designed to extract as much information from the medium as possible while introducing as little as possible to influence that musical information. Graphic equalisers, tone controls and balance controls were banished in favour of simplicity and accuracy.
During my own journey, I realised that my trusted vinyl-based system was becoming obsolete at worst and niche at best. Like many other vinyl disciples, I resisted the urge to buy into CD until the availability of music on the new medium forced me, and many others, to adopt it.
Fortunately, advances in digital recovery and conversion led to products that were tolerable and, eventually, immensely enjoyable, but the debate of vinyl versus digital raged on, and it still does today.
Of course, inevitably, technology moved on several paces, with manufacturers dropping the Compact Disc in favour of higher-resolution formats and streaming. As audiophiles, we see technology as bringing both benefits and headaches. I succumbed to replacing my CD system with a high-end digital streamer and spent months willing it to replicate the glory of the former setup. Alas, it never did.
If you have been following Steve Elford's blogs, you will be familiar with our position on the physical phenomena that exist in our systems. In particular, digital circuits are inherently noisy, with masses of electromagnetic and radio-frequency interference on tap. The addition of convenient but noisy displays further exacerbates the problem.
The result of all this is that the digital signals required for conversion are damaged by default. Whereas the early focus for digital systems was on the integrity of the digital signal itself, latterly digital designers have focused on approximation algorithms, smoothing filters, noise shaping and correction techniques. All of these use complex mathematics and fast processing, but the key words here are "smoothing", "shaping" and "approximation". In other words, "guesswork", albeit from an educated basis.
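To make the "educated guesswork" concrete, here is a deliberately simplified sketch of one such correction technique: concealing a lost or corrupted sample by interpolating between its neighbours. This is purely illustrative and not any specific DAC's algorithm; the function name and signal values are invented for the example.

```python
# Illustrative sketch only: one simple form of "educated guesswork" a
# converter might apply when a sample arrives damaged or missing.
# This is linear interpolation, NOT any specific manufacturer's algorithm.

def conceal_missing_sample(samples, missing_index):
    """Guess a lost sample as the average of its two neighbours."""
    prev_s = samples[missing_index - 1]
    next_s = samples[missing_index + 1]
    return (prev_s + next_s) / 2.0

# On a smoothly varying passage, the guess lands close to the truth:
smooth = [0.0, 0.1, 0.2, 0.3, 0.4]   # the true third sample was 0.2
guess = conceal_missing_sample(smooth, 2)
```

On gentle material like this, the interpolated guess is almost indistinguishable from the original value, which is exactly why such techniques are so widely used.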
Does the guesswork achieve our goal, then? Here's an analogy that you might find useful. In the early eighties, many manufacturers used graphic equalisers to allow the shaping of the musical signal. Graphic equalisers became increasingly elaborate, resembling recording-studio consoles and enabling the user to modify the "shape" of the music through sliders that amplified or attenuated particular frequency bands. Users soon realised that the equaliser settings needed to be changed depending on the musical programme, and graphic equalisers quickly fell out of fashion.
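What those sliders did can be sketched in a few lines: boost or cut the energy in fixed frequency bands. The band edges and gains below are arbitrary examples, not any particular product's design, and the helper function is hypothetical.

```python
import numpy as np

# Illustrative sketch of a graphic equaliser: each "slider" is a
# (low_hz, high_hz, gain) triple applied to that band of the spectrum.
# Band edges and gains here are invented for the example.

def apply_graphic_eq(signal, sample_rate, band_gains):
    """Apply per-band gains (the slider settings) in the frequency domain."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for low, high, gain in band_gains:
        spectrum[(freqs >= low) & (freqs < high)] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

# Example: a "bass boost" slider doubling everything below 200 Hz.
t = np.arange(0, 1.0, 1.0 / 8000.0)
tone = np.sin(2 * np.pi * 100 * t)            # a 100 Hz test tone
boosted = apply_graphic_eq(tone, 8000, [(0, 200, 2.0)])
```

The point of the analogy survives the simplification: the right settings depend entirely on the programme material, which is why fixed slider positions never satisfied for long.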
Fast forward thirty years and we find that graphic equalisers are back, but built into the digital conversion process. Some manufacturers advertise bass correction, for example, but it is the processing occurring within the DAC chips themselves that is really causing the damage. DACs now include error-correction techniques that are, essentially, an educated guess. If the guesses are correct, then we're good to go; if not, the result can at times be very poor, or even unlistenable. What's worse is that you have little or no control over what is being done. So, if a track is unlistenable, you simply don't play it again.
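The same interpolation sketch shows how an educated guess can go badly wrong. On a sharp transient, such as a drum hit, the lost sample is nothing like its neighbours, so the guess misses it entirely. Again, this is an invented illustration, not a real DAC's correction scheme.

```python
# Illustrative sketch: the same neighbour-averaging guess that works on
# smooth passages fails completely on a transient, because the lost
# sample bore no resemblance to the samples around it.

def conceal_missing_sample(samples, missing_index):
    """Guess a lost sample as the average of its two neighbours."""
    return (samples[missing_index - 1] + samples[missing_index + 1]) / 2.0

# A transient: the true middle sample was a sudden peak of 1.0,
# flanked by silence. The guess reconstructs silence instead.
transient = [0.0, 0.0, 1.0, 0.0, 0.0]
guess = conceal_missing_sample(transient, 2)   # guesses 0.0
error = abs(1.0 - guess)                       # the full peak is lost
```

Whether the listener hears the damage depends entirely on the music, which is why the results can swing from unnoticeable to unlistenable.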
So surely the focus should be on minimising the guesswork and preserving the quality of the digital stream? In my next blog, I will discuss how the newly launched T Series streamer ensures that the DAC has to do nothing except convert from digital to analogue, without the guesswork.