Forum Replies Created

    in reply to: C# Windows Sample Code for Quickstart #1850

    Quadko
    Member

    Here’s the second version of the C# class I came up with, based on the “better approach” sparxeng URL above, in case it’s useful to anyone.

    It has fewer failures than the version above, but still some. The failures in both cases have all been port acquisition failures, which I don’t think are a code issue so much as something in Windows, or the Visual Studio debugger not letting the port go. A port acquisition failure was always solved by closing and re-running.

    Still, barring any bugs I introduced when I unwrapped the heavily anonymous-delegate code the sparxeng blog provided, this is supposed to be a cleaner (async) way to use the serial port from .NET, and it is working just fine for me in my C# “get some random data in various formats” app.

        using System;
        using System.Diagnostics;
        using System.IO;
        using System.IO.Ports;

        class SerialPortManager
        {
            SerialPort port;
            byte[] buffer;          // shared read buffer handed to BeginRead
            bool stopping = false;  // set by InitiateClose; acted on when the current read completes
    
            int _BufferSize = 10240;
            // Setting this reallocates the read buffer, so change it before calling Open().
            public int BufferSize { get { return _BufferSize; } set { _BufferSize = value; buffer = new byte[value]; } }
    
            public bool IsOpen { get { if (port == null) return false; return port.IsOpen; } }
    
            public delegate void ReceiveDataHandler(byte[] data);
            public event ReceiveDataHandler ReceiveData;    // fires with a copy of each chunk read
    
            public delegate void ClosedHandler();
            public event ClosedHandler Closed;
    
            public delegate void SerialPortErrorHandler(IOException exception);
            public event SerialPortErrorHandler SerialPortError;
    
            public void Open(string Name, int ReadTimeout = 1000)
            {
                stopping = false;
    
                if (buffer == null || buffer.Length != _BufferSize)
                {
                    BufferSize = _BufferSize;
                }
    
                // Initialize the serial port
                port = new SerialPort(Name, 115200);
                port.DtrEnable = true;          // TrueRNG devices won't stream data until DTR is asserted
                port.ReadTimeout = ReadTimeout;
                port.Open();
    
                port.BaseStream.BeginRead(buffer, 0, buffer.Length, EndRead, null);
            }
    
            void EndRead(IAsyncResult result)
            {
                try
                {
                    int count = port.BaseStream.EndRead(result);
    
                    if (!stopping)
                    {
                        // Hand out a copy of just the bytes read so the shared buffer can be reused
                        byte[] data = new byte[count];
                        Buffer.BlockCopy(buffer, 0, data, 0, count);

                        ReceiveData?.Invoke(data);
                    }
                }
                catch (IOException exception)
                {
                    SerialPortError?.Invoke(exception);
                    Debug.WriteLine(exception.ToString());

                    // The stream isn't usable after an I/O error, so fall through to the
                    // close path below instead of re-arming the read.
                    stopping = true;
                }

                // Start another async read, or close the port if a stop was requested
                if (!stopping)
                {
                    port.BaseStream.BeginRead(buffer, 0, buffer.Length, EndRead, null);
                }
                else
                {
                    port.Close();
                    Closed?.Invoke();
                }
            }
    
            // Request a graceful shutdown; the port itself is closed from EndRead, so we
            // never call Close() while an async read is still pending.
            public void InitiateClose()
            {
                stopping = true;
            }
        }
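
    In case it helps, here’s roughly how I drive it from my test app – just a minimal sketch, with "COM3" and the handler bodies standing in for whatever your app actually needs:

        var manager = new SerialPortManager();

        manager.ReceiveData += data =>
        {
            // data is a private copy of the bytes just read, so it's safe to queue or process elsewhere
            Console.WriteLine("Read {0} bytes", data.Length);
        };
        manager.SerialPortError += ex => Console.WriteLine("Serial error: " + ex.Message);
        manager.Closed += () => Console.WriteLine("Port closed");

        manager.Open("COM3");       // whatever COM port the TrueRNG enumerates as

        // ...let it run, then shut down cleanly:
        manager.InitiateClose();    // Closed fires once the pending read completes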
    
    in reply to: Unreachable random numbers? #1827

    Quadko
    Member

    Left it running overnight, posting the top results here to record them, and now moving on to other usage projects. :)
    TrueRNGPro

    218,209,027,072 bits in 63937.71s
    
    38 1: 1
    36 1: 1
    35 1: 1
    35 0: 1
    34 1: 4
    34 0: 4
    33 1: 9
    33 0: 3
    32 1: 16
    32 0: 13
    31 1: 25
    31 0: 24
    30 1: 55
    30 0: 45
    in reply to: TrueRNGPro: anything bad that would reduce capabilities? #1826

    Quadko
    Member

    Sweet, very good to know. I admit I’m glad it’s even cool to the touch after running it for hours. I get nervous about the thumb drives that get hot to the touch…

    I didn’t know if running p/n junctions at “breaking point” to generate random data instead of stable transistor behavior had some side effect. Glad there’s no issue. :)

    Great project, cheers, mates!

    in reply to: Unreachable random numbers? #1816

    Quadko
    Member

    I got it all plugged in and pulling data from my C# test code in Windows. Very cool!

    I’m thankful for the other forum posts and resources online; my naive initial use of the COM port had both the DTR problem and the “only pulling one byte at a time” problem. Fixing those got me running at speed.
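
    For anyone else hitting those same two issues, the fixes boiled down to something like this (a rough sketch – the port name and buffer size are just placeholders):

        using System.IO.Ports;

        var port = new SerialPort("COM3", 115200);
        port.DtrEnable = true;                  // fix #1: assert DTR or the device stays silent
        port.Open();

        // Fix #2: read blocks from BaseStream instead of pulling one byte at a time
        var buffer = new byte[4096];
        int count = port.BaseStream.Read(buffer, 0, buffer.Length);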

    For fun, the first thing I did with it was run a counter of “runs of bits”, comparing the standard .NET CSPRNG and the TrueRNGPro. Both look great, and in a few hours each generates runs of 31+ 1 or 0 bits in a row (I don’t check alignment, so that’s not the same as generating a specific number, but the RNG doesn’t care!). It’s easy to see the 0.5 falloff at each extra bit of run length.

    Just for fun, here’s the top of the data I’m getting (the two ran for different bit counts/times, and the TrueRNGPro is still running; I plan to run it a few more hours):

    Key:

    [#same bits in a row] [bit 1 or 0]: # times occurred
    So the first CSPRNG line says "the algorithm generated a run of 41 one-bits exactly once in ~1.4T bits over about 3 hours."

    And if I did my math correctly, it would take lots of years to generate an all-0 or all-1 64-bit number at current bit rates, but that’s just the math. So for testing purposes, definitely test random data around interesting classes & values, not across the full range – exactly what we already know.
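
    For anyone curious, the counter is conceptually something like this – a sketch from memory, not my exact code; it just gets fed bytes from whichever source is being tested (the TrueRNGPro stream or RNGCryptoServiceProvider):

        // Needs: using System.Collections.Generic;
        // Tallies maximal runs of identical bits across a byte stream.
        class RunCounter
        {
            public long TotalBits;
            // Counts[bit][runLength] = how many maximal runs of that length have been seen
            public Dictionary<int, long>[] Counts = { new Dictionary<int, long>(), new Dictionary<int, long>() };

            int currentBit = -1;
            int runLength = 0;

            public void AddBytes(byte[] data)
            {
                foreach (byte b in data)
                    for (int i = 7; i >= 0; i--)
                        AddBit((b >> i) & 1);
            }

            void AddBit(int bit)
            {
                TotalBits++;
                if (bit == currentBit) { runLength++; return; }
                if (currentBit >= 0) Record(currentBit, runLength);     // previous run just ended
                currentBit = bit;
                runLength = 1;
            }

            void Record(int bit, int length)
            {
                long n;
                Counts[bit].TryGetValue(length, out n);
                Counts[bit][length] = n + 1;
            }
        }

    As a rough sanity check on the 0.5 falloff: for fair bits, the expected number of maximal k-bit runs of a given value in N bits is about N / 2^(k+2) (roughly 168 runs of length 31 for the ~1.44T-bit CSPRNG sample, in the same ballpark as the 165/158 below), and the expected wait for 64 identical bits of a chosen value is about 2^65 bits – hence the “lots of years.”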

    TrueRNGPro

    
    6,687,032,320 bits in 1956.70s
    
    31 1: 2
    31 0: 1
    30 1: 1
    30 0: 2
    29 1: 4
    28 1: 3
    28 0: 7
    27 1: 12
    27 0: 8
    26 1: 28
    26 0: 11
    25 1: 52
    25 0: 48
    24 1: 91
    24 0: 109
    23 1: 201
    23 0: 185
    

    .NET CSPRNG RNGCryptoServiceProvider

    CSPRNG 
    1,444,732,073,984 bits in 10732.15s
    
    41 1: 1
    39 1: 1
    38 1: 1
    38 0: 7
    37 1: 1
    37 0: 1
    36 1: 5
    36 0: 7
    35 1: 11
    35 0: 12
    34 1: 29
    34 0: 22
    33 1: 53
    33 0: 42
    32 1: 85
    32 0: 86
    31 1: 165
    31 0: 158
    30 1: 313
    30 0: 349
    29 1: 684
    29 0: 670
    
    in reply to: Unreachable random numbers? #1783

    Quadko
    Member

    Thanks, and I appreciate the data. Tracking says my device comes tomorrow; I’m looking forward to getting it plugged in.

    in reply to: Unreachable random numbers? #1757

    Quadko
    Member

    * Last paragraph: “27 0s” – what was I thinking? 31 zero bits is what I meant, of course. :)

    in reply to: LAN TRNG Server? #1756

    Quadko
    Member

    Very cool, thanks for the suggestions and guidance!

    in reply to: Unreachable random numbers? #1754

    Quadko
    Member

    Thanks for the details! Again, I may just be putting my ignorance on display, and I appreciate the info and discussion. I think your comment about whitening vs. raw and the bell curve is likely what I have in mind. If you don’t mind a bit of clarification discussion:

    I certainly understand that for physical processes generating unbiased bits we are absolutely good. And I haven’t gotten to play with the TrueRNG yet, so this is purely a theoretical-understanding question, from looking at PRNG and random.org articles and data in the past and wondering whether it applies to something like the TrueRNGPro.

    When articles talk about using sound sources or random.org’s atmospheric samples, they talk about adjusting the A->D 0/1 conversion threshold based on a running window that tracks bit deviation from 50%, so the output doesn’t drift toward all zeros or all ones when the input amplitude changes. But isn’t an artifact of that a reduced chance of generating “honest” runs of such bits? Similar considerations appear in the selection of PRNG algorithm details. Maybe this doesn’t apply to all physical methods of generating bits?
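
    A toy illustration of the kind of feedback I have in mind (purely hypothetical – I’m not claiming this is what random.org or the TrueRNG actually does): the threshold drifts to keep the recent 1s fraction near 50%, which is exactly the mechanism I worry also fights long “honest” runs.

        // Hypothetical adaptive-threshold bit extractor, only to illustrate the concern above.
        class AdaptiveThresholdExtractor
        {
            double threshold = 0.0;
            const double Step = 0.001;      // how hard the feedback pushes back toward a 50/50 output

            public int NextBit(double sample)
            {
                int bit = sample > threshold ? 1 : 0;
                threshold += bit == 1 ? Step : -Step;   // nudge against whichever bit was just emitted
                return bit;
            }
        }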

    (Trivial example of my fear/understanding: a heuristic A->D window or PRNG algorithm forced to produce exactly a 50% 0-to-1 ratio across every 4 bits would be bad! That forces the random values to be 1100, 1010, 1001, 0101, 0110, or 0011 – only 6 of the 16 possible values are allowed, and neither of the “interesting for testing purposes” extremes, 0000 or 1111, is included!)
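
    (And a quick way to convince yourself of the 6-of-16 count – a throwaway check, nothing more:)

        // Count 4-bit values with exactly two 1 bits: C(4,2) = 6 of the 16 possible values
        int allowed = 0;
        for (int v = 0; v < 16; v++)
        {
            int ones = (v & 1) + ((v >> 1) & 1) + ((v >> 2) & 1) + ((v >> 3) & 1);
            if (ones == 2) allowed++;
        }
        Console.WriteLine(allowed);     // prints 6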

    So I was under the impression that an artifact of heuristic tuning is that longer runs of bits are made even less likely than the purely statistical 50%-per-coin-flip likelihood would suggest. In another example of a poor-quality algorithm, perhaps generating 8 zero bits in a row is a perfect 1-in-2^8 likelihood, but generating 16 zero bits in a row is only half as likely as a perfect 1-in-2^16, and the algorithm will never allow the generation of 32 zero bits in a row.

    I also thought this was acceptable and on purpose because of the use in cryptography – all zero bytes are often unwanted, much less 4 or 8 of them in a row.

    Even assuming I’m not just ignorant of the real details, it’s possible I’m down some weird thought path and need my thinking corrected. Perhaps the statistical degradation involved doesn’t kick in until 2^billion or something, and the analysis window makes sure we have an appropriate, approximately 50% bit distribution over a megabyte rather than something short and bad like a window of 64 bits?

    Again, this question in my brain was about randomly testing inputs in a 2^32 or 2^64 input space and wondering whether the numbers really were all equally possible, or whether binary numbers with “long” runs of the same bit in the larger space were less likely (e.g., is 0x0000001 with its run of 27 0s really as likely as a number like 0xa5a5a5a5 with runs of only 2?). Easily solved for testing purposes by randomly/exhaustively testing ranges around interesting numbers like zero, but it piqued my curiosity about generating randomness.
