[EdgeX-TSC-Core] CBOR - Re-encode Performance Metrics

Anthony Bonafide <anthonymbonafide@...>
 

Hello all,

    Last week during the Core Working Group call we discussed a couple of ways to address some issues with capturing ID and time information when dealing with CBOR content in core-data. One option was to decode the original content, update the necessary data, and re-encode to CBOR before publishing. Before going down that path we wanted to gather some performance data on the re-encoding process. I have created a GitHub repo which simulates the re-encoding process in a way similar to how we would implement the steps above. The README contains instructions on how to run the benchmark tests, along with a description of the test structure.
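
    For context, a minimal sketch of that decode/update/re-encode path is below. It uses the fxamacker/cbor package and a stripped-down Event struct purely as stand-ins; the repo may use a different CBOR library and the full EdgeX contract models.

        // Minimal sketch of the decode/update/re-encode path. The Event struct is a
        // simplified stand-in for the EdgeX Event contract, and fxamacker/cbor is an
        // assumption; the repo may use a different CBOR library.
        package main

        import (
            "fmt"
            "time"

            "github.com/fxamacker/cbor/v2"
            "github.com/google/uuid"
        )

        // Event is a simplified stand-in for the EdgeX Event contract model.
        type Event struct {
            ID      string `cbor:"id"`
            Device  string `cbor:"device"`
            Pushed  int64  `cbor:"pushed"`
            Payload []byte `cbor:"payload"`
        }

        // reencode decodes CBOR bytes into an Event, fills in the ID and Pushed
        // fields, and encodes the updated Event back to CBOR.
        func reencode(original []byte) ([]byte, error) {
            var event Event
            if err := cbor.Unmarshal(original, &event); err != nil {
                return nil, err
            }

            event.ID = uuid.New().String()
            event.Pushed = time.Now().UnixNano()

            return cbor.Marshal(event)
        }

        func main() {
            // Encode a sample Event with a 100K payload so there is something to re-encode.
            original, err := cbor.Marshal(Event{Device: "sensor-01", Payload: make([]byte, 100*1024)})
            if err != nil {
                panic(err)
            }

            updated, err := reencode(original)
            if err != nil {
                panic(err)
            }
            fmt.Printf("original: %d bytes, re-encoded: %d bytes\n", len(original), len(updated))
        }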

     In a nutshell, the benchmark accepts two arguments. The first specifies how many iterations to execute, and the second specifies the Event payload size. The options for the payload size are small, medium, and large, which correspond to 100K, 900K, and 12M respectively (to match the results shared by Tobias regarding the checksum implementation). The logic for the benchmark works as follows (a rough sketch of the loop is included after the list):
  1. Create CBOR bytes of an Event with a payload of the specified size
  2. Capture system metrics and the start time
  3. Decode CBOR to an Event
  4. Change the ID and Pushed fields
  5. Encode the updated Event
  6. Repeat steps 3-5 for the specified number of iterations
  7. Calculate elapsed time
  8. Display metrics
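
    To make the loop concrete, here is a rough sketch of how steps 2 through 8 could be wired together using Go's runtime.MemStats. The actual CLI tool in the repo may capture and report these metrics differently, and the re-encode work is passed in as a closure (for example, the reencode helper from the sketch above):

        // Rough sketch of the benchmark loop described above; metric names mirror the
        // table below. The re-encode step is passed in as a closure, and the actual
        // CLI tool in the repo may capture and report these metrics differently.
        package main

        import (
            "fmt"
            "runtime"
            "time"
        )

        func runBenchmark(iterations int, work func() error) error {
            var before runtime.MemStats
            runtime.ReadMemStats(&before) // step 2: capture system metrics
            start := time.Now()           // step 2: capture the start time

            for i := 0; i < iterations; i++ { // step 6: repeat steps 3-5
                if err := work(); err != nil { // steps 3-5: decode, update fields, encode
                    return err
                }
            }

            elapsed := time.Since(start) // step 7: calculate elapsed time
            var after runtime.MemStats
            runtime.ReadMemStats(&after)

            // Step 8: display metrics as deltas between the two MemStats snapshots.
            fmt.Printf("average re-encode time: %d ns\n", elapsed.Nanoseconds()/int64(iterations))
            fmt.Printf("GC runs: %d\n", after.NumGC-before.NumGC)
            fmt.Printf("GC stop-the-world time: %d ms\n",
                (after.PauseTotalNs-before.PauseTotalNs)/uint64(time.Millisecond))
            fmt.Printf("memory allocations: %d\n", after.Mallocs-before.Mallocs)
            fmt.Printf("system memory delta: %d bytes\n", after.Sys-before.Sys)
            return nil
        }

        func main() {
            // Stand-in work function; in practice this would be the decode/update/encode
            // step, e.g. the reencode helper from the previous sketch.
            work := func() error { return nil }
            if err := runBenchmark(1000, work); err != nil {
                panic(err)
            }
        }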

    A run of the tests on my laptop resulted in the following:

Metric                                      100K payload    900K payload    12M payload
Average re-encode time (nanoseconds)        4148            4497            4554
GC runs                                     590             755             186
GC stop-the-world time (milliseconds)       57              71              21
Memory allocations (end minus start)        30000096        30000148        30000046
System memory in bytes (end minus start)    1179648         1441792         1179648

    There is more information provided by the CLI tool, but these are the metrics I felt might be the most important. There are also a few Go Benchmark tests in the repo which benchmark encoding, decoding, and the re-encoding process listed above. Each of these tests is isolated and provides another way to get performance metrics. The instructions for executing the Go Benchmark tests can be found in the Go Benchmark section of the README. Hopefully this helps, and if anyone has ideas or suggestions for more and/or better metrics, feel free to reach out.
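
    For anyone unfamiliar with the Go benchmark style, a minimal benchmark along those lines might look like the sketch below. It assumes it lives in the same package as the earlier re-encode sketch (so the Event type and reencode helper are in scope); the repo's own benchmark tests are the authoritative reference and may be structured differently.

        // Minimal Go benchmark in the style of the tests mentioned above. It assumes
        // the Event type and reencode helper from the earlier sketch are in the same
        // package; the repo's own benchmarks are the authoritative reference.
        package main

        import (
            "testing"

            "github.com/fxamacker/cbor/v2"
        )

        func BenchmarkReencode(b *testing.B) {
            // Encode one Event up front so only the re-encode path is measured.
            encoded, err := cbor.Marshal(Event{Device: "sensor-01", Payload: make([]byte, 100*1024)})
            if err != nil {
                b.Fatal(err)
            }
            b.ResetTimer()

            for i := 0; i < b.N; i++ {
                if _, err := reencode(encoded); err != nil {
                    b.Fatal(err)
                }
            }
        }

    Running it with go test -bench=Reencode -benchmem reports the per-iteration time in nanoseconds along with allocation counts.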


Thank you,

Anthony Bonafide
