I have made several large changes to the ngdbf tool since the conference in Florida. I created a src folder, moved all the Octave scripts into it, and wrote a shell script (though it is still essentially an Octave script) to serve as the main wrapper. It now accepts several command-line arguments, but I am still working on a help function that prints all of those options. I also pulled some of the double- and triple-nested for loops in run_ngdbf out into their own functions, and I incorporated STORM into the tool as an option alongside PRISM.
I realized that the biggest run-time issue was not file writing or simulation; it was calculating the transition probabilities. I was able to replace one of the for loops with matrix operations, which reduced the run time considerably. I also found and fixed a bug in the energy calculation function. For the (8,8) trapping set, a run still takes about 30 minutes on my machine, which will be a problem when we calculate the average probabilities.
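Roughly, the loop-to-matrix change looked like this. This is a NumPy sketch with made-up names and a made-up transition rule (a normal-CDF threshold crossing), not the actual ngdbf code; the same broadcasting trick works in Octave:

```python
import math
import numpy as np

# math.erf is scalar-only, so wrap it for elementwise use on arrays
_erf = np.vectorize(math.erf)

def norm_cdf(x):
    """Standard normal CDF, elementwise."""
    return 0.5 * (1.0 + _erf(np.asarray(x, dtype=float) / math.sqrt(2.0)))

def transition_probs_loops(energies, thresholds, sigma):
    """Nested-loop version: P[i, j] = Phi((thresholds[j] - energies[i]) / sigma)."""
    P = np.zeros((len(energies), len(thresholds)))
    for i, e in enumerate(energies):
        for j, t in enumerate(thresholds):
            P[i, j] = norm_cdf((t - e) / sigma)
    return P

def transition_probs_vec(energies, thresholds, sigma):
    """Vectorized version: broadcasting builds the whole (i, j) grid at once."""
    e = np.asarray(energies, dtype=float)
    t = np.asarray(thresholds, dtype=float)
    return norm_cdf((t[None, :] - e[:, None]) / sigma)
```

The two functions return the same matrix; the vectorized one just lets the BLAS-backed array code do the iteration instead of the interpreter.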
Next, I need to review how quantizing the channel affects the algorithm and then implement that. I am also still a little confused about calculating the FER. I remember that at the conference we talked about normcdf(0) being the probability of an error, but I had a thought that may or may not be on the right track: couldn't the FER be calculated as the sum, over the initial states, of 1 - P(state = 0), each term weighted by the probability that the channel sample starts in that state? Then we would somehow have to account for the multiplicity of the trapping set.
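To make the idea concrete, here is a tiny numeric sketch of that estimate. The numbers and names are made up (three initial states, placeholder probabilities), and I show two ways the multiplicity could enter: a union bound, and an independence assumption over the copies of the trapping set:

```python
import numpy as np

# Hypothetical inputs:
#   pi0[s]    = probability the channel sample starts in initial state s
#   p_zero[s] = probability of reaching the correct state 0 from state s
#               (this would come out of the PRISM/STORM model)
pi0 = np.array([0.90, 0.07, 0.03])
p_zero = np.array([0.999, 0.95, 0.80])
multiplicity = 12  # made-up count of trapping-set copies in the code graph

# Failure probability for a single trapping set: sum over initial states
# of (1 - P(state = 0)) weighted by the starting distribution
p_fail_one_set = np.sum(pi0 * (1.0 - p_zero))

# Two candidate ways to fold in the multiplicity
fer_union_bound = multiplicity * p_fail_one_set
fer_independent = 1.0 - (1.0 - p_fail_one_set) ** multiplicity
```

The union bound always dominates the independence estimate, and the two agree when the single-set failure probability is small, so at low error rates the choice may not matter much.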
Sorry for the long post. I am also still working on slides for Davey and MacKay's method for constructing and decoding NB-LDPC codes. @Winstead @Zinnia Muntaha Mowri is there a meeting time already set for this week? I should be available all week except for Wednesday afternoon.