DSP code box - flawed?
Posted: Mon Jun 01, 2020 7:17 pm
This might cause controversy. I was asking myself why I have had so many difficulties with the DSP code editor for so long, and I think I have spotted two main reasons. But first, a little about me.
I come from the graphics area: game development, logo design, 3D graphics (using Cinema4D), etc. Something used there a lot is the "pixel shader". A shader is a specialised small program, mostly running on the graphics card (vertex shaders, for example, deal with 3D geometry), and a pixel shader processes an endless stream of pixels throughout the lifetime of the application, be it a game, Cinema4D or Photoshop.
A pixel is a 32-bit value (just like an audio sample in Flowstone), and to fill a frame at (in this example) Full HD resolution, you need over 2 million of them (1920 × 1080 = 2,073,600). Games and applications mostly run at 60 frames per second (fps), so that's 124,416,000 (over 124 million) pixels per second. Compare that to audio's 44,100 samples per second, and as a game developer you mildly smile.
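The throughput figures above are plain arithmetic; a quick sketch to verify them:

```python
# Full HD frame: 1920 x 1080 pixels
pixels_per_frame = 1920 * 1080        # 2,073,600 pixels per frame

print(pixels_per_frame * 60)          # at 60 fps: 124,416,000 pixels/s
print(pixels_per_frame * 30)          # at 30 fps: 62,208,000 pixels/s
print(44_100)                         # audio: 44,100 samples/s
```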
I created pixel shaders for DirectX (the interface to the graphics card) for various tasks, so of course I had to take care of speed and efficiency. For a graphics card the resolution is not the main concern; more important is how many shaders are used, and what kind. Pixel shaders are the most demanding of them. I once ran a test with a very complex shader algorithm applied to over 58,000 objects; I had to turn the framerate down to 30 (so "just" 62 million pixels per second), but that rate was stable. On hardware from 10 years ago!
In short: I know the concept of "one sample at a time", I am trained in writing efficient, speed-optimized code, and I have dealt with this concept on a much more demanding level.
So why do I have these issues? I think there are two main points. As you can see from the examples, I tend to visualize my algorithms; so far I have failed to manage that for DSP code.
And secondly, the DSP editor feels unfinished. I always have the impression that it was planned to do much more, that preparations were made in this regard which now complicate things, and that it was then never finished and left in that awkward state.
In a pixel shader I can directly address the current pixel. I don't need to define a stream, because that is the whole purpose of a pixel shader: working on a pixel stream! So you write something like current.getRed() (call it current, this, self; the name doesn't matter for this example) and get the value of the red channel of this pixel.
But that isn't all. A pixel shader is also aware of its context! I can easily access any pixel of the current frame, for example with pixel(x, y).getRed(). This is read access only, of course; only the current pixel can be altered. The good thing about this is that I can alter the current pixel based on its neighbours. That is used for all kinds of useful stuff: blur algorithms, color correction, interpolation. No latency is introduced, since it is all one frame that is about to be drawn (just like the underlying buffer in Flowstone, from which the current sample is drawn). From this one feature alone I can calculate distances, angles, color predictions and much more. And there is much more beyond it. All of it is logical.
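The read-neighbours/write-current model described above can be sketched outside a shader language too. Here is a minimal Python illustration of a 3×3 box blur, assuming a greyscale frame stored as a list of lists (the function and its name are mine, purely for illustration): all reads come from the source frame, and the current output pixel is the only thing written, just as in a pixel shader.

```python
def box_blur(frame):
    """3x3 box blur: each output pixel is the average of its
    neighbourhood in the *source* frame (read-only context),
    mirroring how a pixel shader samples the current frame."""
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]           # separate output buffer
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        acc += frame[ny][nx]      # read any neighbour
                        n += 1
            out[y][x] = acc / n                   # write only the current pixel
    return out
```

With a frame containing a single bright pixel, the blur spreads its value evenly to the surrounding pixels; no second buffer has to be managed by the shader author, because the source frame is simply available as context.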
The DSP code editor, however, is illogical. Although the last sample is stored in a buffer, I can't access it; instead it has to be stored a second time, inside the DSP code, via a float definition. It also introduces latency whenever I need access to samples other than the current one. Time-consuming, illogical and inefficient.
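To make the complaint concrete: in per-sample stream code, access to the previous sample has to be maintained by hand in an extra variable. A minimal Python sketch of such a loop, using a simple two-point averager as the example (the names prev and process are mine, not Flowstone's):

```python
def process(samples):
    """Two-point moving average: out[n] = (in[n] + in[n-1]) / 2.
    The previous sample must be kept in a separate variable (prev),
    even though the host already holds it in its own buffer --
    this is the duplicated storage described above."""
    prev = 0.0                        # manual copy of the last sample
    out = []
    for x in samples:                 # "one sample at a time"
        out.append((x + prev) * 0.5)
        prev = x                      # store it again, by hand
    return out
```

For a constant input the averager settles immediately after the first sample; the point is not the filter itself but that prev has to exist at all, whereas a pixel shader would simply read the neighbour from the frame.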
Given the comparisons, what are your thoughts?
Examples:
ellipse pixel shader
https://www.dropbox.com/s/gvvfwe1m56l1e ... x.png?dl=0
visualization of an algorithm that sets an object a relative distance apart from a circular area that encompasses a defined group of objects
https://www.dropbox.com/s/ydpqvv89n2rgs ... e.png?dl=0
application that builds patterns with self-programmed pixel shaders (and cubic interpolation color range)
https://www.dropbox.com/s/3x98ztyi7vp6222/1.png?dl=0
...and example renderings created with above application
https://www.dropbox.com/sh/qrvcif6mfjv6 ... 4HnBa?dl=0
C-DEX, automatic realtime color palette animation
https://www.dropbox.com/s/7n88imi66jgvs ... n.png?dl=0
a pixel shader applied to over 58,000 objects, running at 30 fps - 8 years ago!
https://www.dropbox.com/s/u6uyqiofbdkqs ... 3.png?dl=0
top down 2D water shader
https://www.dropbox.com/s/dqylddzub1xkqhw/Water.fx?dl=0
the hardware of that time (from 10 years ago)
https://www.dropbox.com/s/5p7hmy0z5yva2 ... o.png?dl=0