* 5G-IANA: UULM Network Monitoring
This repository contains the CI/CD and Dockerfiles necessary to build the UULM Network Monitoring Tool.
For demonstration purposes, =videoprobe= can be deployed ahead of time: it waits for a start command before it begins running.
To send that command, run:
#+begin_src sh
curl -X GET -H "Content-Type: application/json" -d "{\"node_ip\": [\"<obu-node endpoint>\",\"<pqos endpoint>\"], \"stream_ip\": \"<ping target>\", \"stream_url\": \"<stream url>\"}" http://<videoprobe ip/port>/demo/start
#+end_src
- node_ip: A list of API endpoints =videoprobe= should send the collected data to, e.g. _[http://192.168.0.149:8001/upload, http://192.168.0.149:8002/upload]_.
- stream_ip: The IP =videoprobe= measures the latency to. Usually this is the same IP as the host in ~stream_url~, e.g. _192.168.0.149_.
- stream_url: The full path to the nginx proxy that's hosting an RTMP stream, e.g. _rtmp://192.168.0.149/live/test_.
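The quoting in the command above is easy to get wrong.
One way to assemble the payload in a shell variable first and inspect it before sending (a sketch; all endpoints below are illustrative placeholders):
#+begin_src sh
# Build the JSON payload step by step to avoid escaping mistakes.
# All IPs/ports are example placeholders.
NODE_IPS='"http://192.168.0.149:8001/upload","http://192.168.0.149:8002/upload"'
STREAM_IP="192.168.0.149"
STREAM_URL="rtmp://192.168.0.149/live/test"
PAYLOAD="{\"node_ip\": [${NODE_IPS}], \"stream_ip\": \"${STREAM_IP}\", \"stream_url\": \"${STREAM_URL}\"}"
echo "${PAYLOAD}"
# Then send it (fill in the videoprobe address):
# curl -X GET -H "Content-Type: application/json" -d "${PAYLOAD}" http://<videoprobe ip/port>/demo/start
#+end_src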
** Testing Locally
When testing locally we may host the videostream provider and the consumer on the same device.
This is not the case for the deployment on the 5G-IANA platform, where we put them on different clusters (see [[file:maestro-compose.yml]]).
All files regarding local testing can be found in [[file:local/]].
1. Make sure to have the GNSS Dongle connected as a device at ~/dev/ttyACM0~.
   If it has another name, change the entry in [[file:local-docker-compose.yml][local-docker-compose.yml]] accordingly.
2. Run ~docker compose -f local-docker-compose.yml up --build~ to build and start all of the =*Dockerfile= images.
3. For the current version, which is built for the demonstration, run the ~curl~ command to provide =videoprobe= with the endpoint it should send the data to.
   Usually that would be the =obu-node= container; for local testing we use [[file:app.py]] instead.
   Adjust the port in the curl command so it looks roughly like this:
#+BEGIN_SRC sh
# Another example: curl -X GET -H "Content-Type: application/json" -d "{\"node_ip\": [\"https://webhook.site/30ffd7cd-0fa5-4391-8725-c05a1bf48a75/upload/\"], \"stream_ip\": \"192.168.30.248\", \"stream_url\": \"rtmp://192.168.30.248:32731/live/test\"}" http://192.168.30.248:31234/demo/start
curl -X GET -H "Content-Type: application/json" -d "{\"node_ip\": [\"http://192.168.0.149:8001/upload\",\"http://192.168.0.149:8002/upload\"], \"stream_ip\": \"192.168.0.149\", \"stream_url\": \"rtmp://192.168.0.149/live/test\"}" http://192.168.0.149:8000/demo/start
#+END_SRC
   The second example assumes your device's IP is =192.168.0.149=.
4. Once everything is running, you can simulate the DMLO's ~get_data_stats~ call by running:
   ~curl -X GET -H "Content-Type: application/json" -d "{\"id\": 1}" http://<IP of videoprobe>:8000/data_collection/get_data_stats~
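The local steps above can be chained into a small smoke-test script.
The sketch below is a dry run: it only builds and prints the commands, with placeholder IPs and ports; pipe the output to ~sh~ (or drop the ~echo~ lines) to actually execute them.
#+begin_src sh
# Dry-run sketch of the local workflow; prints each command instead of running it.
VP="192.168.0.149:8000"   # videoprobe host:port (placeholder)
START_PAYLOAD='{"node_ip": ["http://192.168.0.149:8001/upload"], "stream_ip": "192.168.0.149", "stream_url": "rtmp://192.168.0.149/live/test"}'
STATS_PAYLOAD='{"id": 1}'
echo "docker compose -f local-docker-compose.yml up --build -d"
echo "curl -X GET -H 'Content-Type: application/json' -d '${START_PAYLOAD}' http://${VP}/demo/start"
echo "curl -X GET -H 'Content-Type: application/json' -d '${STATS_PAYLOAD}' http://${VP}/data_collection/get_data_stats"
#+end_src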
** Running on 5G-IANA
Unlike local testing, the deployment on the 5G-IANA platform hosts the videostream provider and the consumer on different clusters (see [[file:maestro-compose.yml]]).
1. Make sure OBUs are connected by running the following command on the MEC:
   ~kubectl get nodes  # UULM-OBU1 and UULM-OBU2 should be present~
2. Make sure the OBUs each have a GNSS receiver connected to them.
   If there is no device called ~/dev/ttyACM0~ on each OBU, change the entries in [[file:docker-compose.yml][docker-compose.yml]] and [[file:maestro-compose.yml][maestro-compose.yml]] to the actual names of the GNSS receivers and redeploy the images.
   An easier alternative, if feasible, is to unplug the GNSS receiver, reboot the machine, and plug it back in.
3. Find out the IPs of the OBUs and run ~curl -X GET -H "Content-Type: application/json" -d "{\"ip\": \"http://192.168.100.2:32123/upload\"}" http://192.168.100.2:8000/demo/start~ on each of them.
   Here ~192.168.100.2~ is a placeholder for the respective OBU IP, ~32123~ for the port the =obu-node= container listens on for data uploads, and ~8000~ for the port =videoprobe= listens on for the start command.
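The per-OBU start calls can be scripted once the OBU IPs are known.
A dry-run sketch (the IPs and ports are placeholders; it prints the commands, so pipe the output to ~sh~ or drop the ~echo~ to actually send the requests):
#+begin_src sh
# Generate the start command for each OBU (placeholder IPs/ports).
for OBU_IP in 192.168.100.2 192.168.100.3; do
  PAYLOAD="{\"ip\": \"http://${OBU_IP}:32123/upload\"}"
  echo "curl -X GET -H 'Content-Type: application/json' -d '${PAYLOAD}' http://${OBU_IP}:8000/demo/start"
done
#+end_src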
** Open internal Ports
- *1935*: RTMP port of =web=, providing the =sender= stream
- *8000*: Endpoint of =videoprobe=
** Configurations/Environment Variables
- STREAM_URL: The URL of an RTMP-based video stream. In this environment it should point to =web=.
- RUST_LOG: The logging level of the network monitoring tool itself.
- ROCKET_CONFIG: Specifies the path to the configuration file for the API endpoint of =videoprobe=; in practice it is effectively constant.
- VP_TARGET: The API endpoint to upload the collected data to with a ~POST~ request. This variable should not be used during the demonstration.
- CMD: Needed as an alternative to the ~command:~ keyword, which is usually used to overwrite a container's entrypoint.
- GNSS_ENABLED: Chooses whether =videoprobe= should run with "dry GPS". Dry GPS means the tool runs without GPS capabilities, for cases where the user is sure no GNSS device is present or satellite connectivity cannot be ensured.
- GNSS_DEV: The path of the mounted GNSS device. Needed to start ~gpsd~ inside the container. Changes to it should also be applied to the corresponding [[file:local-docker-compose.yml]] and [[file:docker-compose.yml]].
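For illustration, the variables above might be set like this when running the container by hand.
This is only a sketch: every value, the config path =/app/Rocket.toml=, and the image name =videoprobe:latest= are made-up placeholders; the compose files in this repository are authoritative.
#+begin_src sh
# Example environment for videoprobe (all values are placeholders).
export STREAM_URL="rtmp://web/live/test"   # RTMP stream served by the web container
export RUST_LOG="info"                     # logging level of the monitoring tool
export ROCKET_CONFIG="/app/Rocket.toml"    # hypothetical path to the API config
export GNSS_ENABLED="true"                 # set to "false" for dry GPS
export GNSS_DEV="/dev/ttyACM0"             # mounted GNSS receiver, used by gpsd
# e.g.: docker run --device "${GNSS_DEV}" -e STREAM_URL -e RUST_LOG \
#   -e ROCKET_CONFIG -e GNSS_ENABLED -e GNSS_DEV videoprobe:latest
#+end_src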