Conversation
NOTE: it's not really ready for review yet. The basic idea is just to get FPS metrics into Influx so we can watch them across all versions of everything, so nothing slips by.
@@ -0,0 +1,85 @@
name: HIL FPS Benchmark Influx
@danilo-pejovic not sure if this is needed: would you rather have a separate workflow, or instrument all the tests directly (in rvc4_test.yaml)? My preference is to instrument all tests and gather all the data, but it's your choice.
The tests running on our HIL are the bom-test.yaml ones, not rvc4_test; those look like internal tests for the ML team. I would just add the Influx reporting to bom-tests and not create a new workflow.
We are still creating a new, unnecessary workflow. Can we edit the one that is already part of the BOM to also pass info to Influx?
https://github.com/luxonis/modelconverter/blob/main/.github/workflows/bom-test.yaml
# Cache device metadata once for the whole run using oakctl. If oakctl is not
# available, keep the benchmark runnable and record explicit placeholder values
# for the missing oakctl-derived metadata.
oakctl_output=$(oakctl list --format json 2>/dev/null || printf '')
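The best-effort fallback the quoted comment describes might look roughly like this in Python. This is a sketch, not the PR's actual code, and the oakctl JSON field names (`mxid`, `os_version`) and the `"unknown"` placeholder are assumptions:

```python
import json
import subprocess

PLACEHOLDER = "unknown"  # explicit marker for missing oakctl-derived metadata


def parse_oakctl_list(raw: str) -> dict:
    """Pick out the fields we care about from `oakctl list --format json` output.

    Field names here are assumptions about the oakctl JSON schema.
    """
    try:
        devices = json.loads(raw) if raw else []
    except json.JSONDecodeError:
        devices = []
    first = devices[0] if devices else {}
    return {
        "mxid": first.get("mxid", PLACEHOLDER),
        "os_version": first.get("os_version", PLACEHOLDER),
    }


def oakctl_metadata() -> dict:
    """Best effort: never raise, fall back to placeholders if oakctl is absent."""
    try:
        raw = subprocess.run(
            ["oakctl", "list", "--format", "json"],
            capture_output=True, text=True, check=True, timeout=30,
        ).stdout
    except (OSError, subprocess.SubprocessError):
        raw = ""
    return parse_oakctl_list(raw)
```

The point of splitting out `parse_oakctl_list` is that the placeholder logic stays testable without a device attached.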
oakctl is not part of these tests. Why are we adding more moving parts that can fail?
The HIL framework has built-in functions to get all the necessary info from the camera:
danilo@danilo:~/Downloads$ camera -t slo316-4pro -n test oak4_pro info -v
+----------+----------+------------+------------+------------+------------+------------+--------------+----------+----------------+-------------------------------------------------+----------+
| name     | model    | platform   | revision   | mxid       | username   | password   | hostname     | adb      | state          | os_version                                      | adb_id   |
+==========+==========+============+============+============+============+============+==============+==========+================+=================================================+==========+
| oak4_pro | oak4_pro | rvc4       | 9          | 1583798758 | root       | oelinux123 | 10.12.143.89 | 5e66d9e6 | setup-complete | 1.30.0+955791f9014cfe160419deae2f64980f180f4d9c | 5e66d9e6 |
+----------+----------+------------+------------+------------+------------+------------+--------------+----------+----------------+-------------------------------------------------+----------+
danilo@danilo:~/Downloads$ camera -t slo316-4pro -n test oak4_pro info -v -j
[{"name": "oak4_pro", "model": "oak4_pro", "platform": "rvc4", "revision": 9, "mxid": "1583798758", "username": "root", "password": "oelinux123", "hostname": "10.12.143.89", "adb": "5e66d9e6", "state": "setup-complete", "os_version": "1.30.0+955791f9014cfe160419deae2f64980f180f4d9c", "adb_id": "5e66d9e6"}]
Alternatively, the camera object in Python can give all the necessary info in an even more readable/maintainable way. That is probably how we should supply that info.
I will sub in the JSON version of the above:
(.hil) hil@slo130-fl:~$ camera -t slo316-4pro -n test oak4_pro info -j
[{"name": "oak4_pro", "model": "oak4_pro", "platform": "rvc4", "revision": 9, "mxid": "1583798758", "hostname": "10.12.143.89", "os_version": "1.27.1+9950fe7353d76fec53a99019be9e1d6130e8f69a", "adb_id": "5e66d9e6"}]
Q: how to determine the camera used if there are multiple cameras?
For now I will just use `camera -t ${HIL_TESTBED} -n test all info -j` and take the first entry:
(.hil) hil@slo238-4s:~$ camera -t ${HIL_TESTBED} -n test all info -j
use_ssh
2026-04-13 07:21:36,832 [INFO] ssh command failed with the following output:
2026-04-13 07:21:36,832 [INFO]
2026-04-13 07:21:36,832 [INFO] Error message (stderr):
2026-04-13 07:21:36,832 [INFO]
2026-04-13 07:21:36,832 [INFO] Exit code: 5
[{"name": "oak4_s", "model": "oak4_s", "platform": "rvc4", "revision": 9, "mxid": "2671960531", "hostname": "10.12.140.235", "os_version": "1.27.1+9950fe7353d76fec53a99019be9e1d6130e8f69a", "adb_id": "9f42e1d3"}]
As stated in DMs via the example provided, and in my first comment here, I would prefer camera metadata to be gathered inside the script:

Alternatively, the camera object in Python can give all the necessary info in an even more readable/maintainable way. That is probably how we should supply that info.

It would be more readable, consistent, and maintainable.
return f'"{escaped}"'

def _write_fps_result_to_influx(
Can this be implemented in hil_framework? We can probably hardcode the Influx variables there and then create a simple function like camera/testbed.log_to_influx(arguments); it would be pretty similar to this implementation and to how sanity checks already work for HIL. That way it is more maintainable: if we, for example, change where we host our database, it will be one change instead of changing every repo.
Agreed, but let's do it in a follow-up ticket, just so I can get this work finished and immediately useful.
I would prefer to do things properly from the start if we know it is something we are going to be building on; it shouldn't take long.
tests/test_benchmark/conftest.py
default="rvc4",
help="Target platform to benchmark (default: rvc4).",
)
parser.addoption(
The comments above/below add more info.
This should probably be implemented in a more maintainable way, to avoid piping info we already have multiple times.
So if I understand correctly, I should just remove this here and fail hard if the env is not properly configured. I will make the change; please comment again if you meant something else.
We are passing the camera mxid, OS version, etc. as arguments to the script when there is no need to do that; that info is readily available inside the script via the HIL framework.
In the graph, the max and min values for models don't seem to be consistent. IIRC they are just an expected value with a buffer on both sides, so they should always be the same for the same model. Are we maybe mixing up models in the tables? Also, as long as those min and max values are set constants, we probably don't need to track them for every test; it just adds clutter.
Changes made, ready for review again.


Adds InfluxDB pushing of FPS results. It is best effort and depends on oakctl being installed on the running system.