Hi All,
I was hoping to crowdsource some of your experience today on a common application architecture that is bugging me slightly; I can't decide whether it is genuinely bad or if it is just me. In the spirit of Steve Watts' talk at the European CLA Summit: something stinks!
The problem it solves is this: I have multiple data sources, usually at the same rate (though I have seen mixed rates as well), that I want to log to disk as a single file. I have seen the same solution several times involving FGVs, and that is what makes me nervous!
What they do is have a separate DAQ process per data source and a single write-to-file process. Then either (a) they have an FGV per DAQ process, or (b) a single FGV that the different processes write into at different points in an array. In the log-to-file process they then either (a) read from all the FGVs or (b) read from the single FGV. I don't like this because:
- There is no buffering; the only thing that guarantees you actually get every sample is that the various loops stay in phase (the sketch after this list tries to make that concrete).
- FGVs make the code pretty hard to read.
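LabVIEW FGVs are graphical, so I can't paste one here, but to make the buffering complaint concrete, here is a rough text-language analogue in Python (the names are mine, purely for illustration). An FGV used as a simple data tag behaves like a last-value register:

```python
import threading

class LastValueRegister:
    """Rough analogue of an FGV used as a data tag: it holds only
    the most recent value, with no buffering at all."""

    def __init__(self):
        self._lock = threading.Lock()
        self._value = None

    def write(self, value):
        with self._lock:
            # A new sample silently overwrites the old one; if the
            # logging loop hasn't read it yet, that sample is lost.
            self._value = value

    def read(self):
        with self._lock:
            # If the logger runs faster than the DAQ loop, it
            # re-reads the same stale sample and logs a duplicate.
            return self._value
```

Nothing in there enforces that reads and writes alternate: a slow reader loses samples and a fast reader logs duplicates, which is exactly the "loops must stay in phase" problem.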
But it is hard to rip it to shreds when fundamentally people are succeeding with this. What I am interested in is what you guys have found to be a good solution to this problem. Ideally the data should be buffered, which points to using queues, but another smell to me is waiting on multiple queues in one loop. One thought I have had is a data collator process: it would still have to wait on multiple queues, but it then outputs a single, combined set of data for the logger. Or is this just option (b) from above again? I think this could be done in the Actor Framework as well, with the collator spawning the various DAQ actors.
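To show what I mean, here is a minimal sketch of the collator idea in the same Python analogue (again, all names are illustrative, and I'm assuming the sources run at the same nominal rate; mixed rates would need a merge-by-timestamp step instead of one blocking read per source):

```python
import queue
import threading
import time

def daq_process(source_id, out_q, period_s=0.1):
    """Stand-in for one DAQ loop: pushes (timestamp, value) samples
    onto its own dedicated queue, so nothing is ever overwritten."""
    while True:
        out_q.put((time.time(), "sample from source %d" % source_id))
        time.sleep(period_s)

def collator(in_queues, out_q):
    """Blocks on each DAQ queue in turn, then emits one combined
    record, so downstream code only ever sees a single queue."""
    while True:
        out_q.put([q.get() for q in in_queues])

def logger(in_q, path="log.csv"):
    """The single write-to-file process: drains the collated queue."""
    with open(path, "a") as f:
        while True:
            f.write(repr(in_q.get()) + "\n")
            f.flush()

if __name__ == "__main__":
    daq_queues = [queue.Queue() for _ in range(3)]
    log_queue = queue.Queue()
    for i, q in enumerate(daq_queues):
        threading.Thread(target=daq_process, args=(i, q), daemon=True).start()
    threading.Thread(target=collator, args=(daq_queues, log_queue), daemon=True).start()
    threading.Thread(target=logger, args=(log_queue,), daemon=True).start()
    time.sleep(1)  # let a few records through before the demo exits
```

Because each source has its own queue, samples are buffered rather than overwritten; the cost is that the collator stalls if any one source stops producing, and the logger only ever has to service a single queue.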
So what are your thoughts? Is there a deodorant for this, or is this problem always going to smell?
Cheers,
James