From b026529bfa0e565945e2db22008c899a1800358c Mon Sep 17 00:00:00 2001
From: Matt
Date: Sat, 20 Jan 2024 20:52:06 -0500
Subject: [PATCH] address review

---
 source/docs/calibration/calibration.rst                   | 2 +-
 source/docs/examples/simposeest.rst                       | 2 +-
 source/docs/objectDetection/about-object-detection.rst    | 2 +-
 source/docs/programming/photonlib/getting-target-data.rst | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/source/docs/calibration/calibration.rst b/source/docs/calibration/calibration.rst
index 10e4f8f2..1fbbff12 100644
--- a/source/docs/calibration/calibration.rst
+++ b/source/docs/calibration/calibration.rst
@@ -114,7 +114,7 @@ Investigating Calibration Data with mrcal
 
 `mrcal `_ is a command-line tool for camera calibration and visualization. PhotonVision has the option to use the mrcal backend during camera calibration to estimate intrinsics. mrcal can also be used post-calibration to inspect snapshots and provide feedback. These steps will closely follow the `mrcal tour `_ -- I'm aggregating commands and notes here, but the mrcal documentation is much more thorough.
 
-Start by `Installing mrcal `_. Note that while mrcal *calibration* using Photon is supported on all platforms, but investigation right now only works on Linux. Some users have also reported luck using `WSL 2 on Windows `_ as well. You may also need to install ``feedgnuplot``. On Ubuntu systems, these commands should be run from a standalone terminal and *not* the one `built into vscode `_.
+Start by `Installing mrcal `_. Note that while mrcal *calibration* using PhotonVision is supported on all platforms, investigation currently only works on Linux. Some users have also reported luck using `WSL 2 on Windows `_. You may also need to install ``feedgnuplot``. On Ubuntu systems, these commands should be run from a standalone terminal and *not* the one `built into vscode `_.
 
 Let's run ``calibrationUtils.py`` as described above, and then cd into the output folder. From here, you can follow the mrcal tour, just replacing the VNL filename and camera imager size as necessary. My camera calibration was at 1280x720, so I've set the XY limits to that below.
 
diff --git a/source/docs/examples/simposeest.rst b/source/docs/examples/simposeest.rst
index 9c4492d1..e376a77a 100644
--- a/source/docs/examples/simposeest.rst
+++ b/source/docs/examples/simposeest.rst
@@ -47,7 +47,7 @@ For both simulation and on-robot code, we create objects to represent the physic
          :lineno-start: 83
 
 
-To incorporate Photon Vision, we need to create a :code:`PhotonCamera`:
+To incorporate PhotonVision, we need to create a :code:`PhotonCamera`:
 
 .. tab-set::
 
diff --git a/source/docs/objectDetection/about-object-detection.rst b/source/docs/objectDetection/about-object-detection.rst
index c32c5bb9..73850603 100644
--- a/source/docs/objectDetection/about-object-detection.rst
+++ b/source/docs/objectDetection/about-object-detection.rst
@@ -6,7 +6,7 @@ How does it work?
 
 PhotonVision supports object detection using neural network accelerator hardware built into Orange Pi 5/5+ coprocessors. The Neural Processing Unit, or NPU, is `used by PhotonVision `_ to massively accelerate certain math operations like those needed for running ML-based object detection.
 
-For the 2024 season, Photon ships with a **pre-trained NOTE detector** (shown above), as well as a mechanism for swapping in custom models. Future development will focus on enabling lower friction management of multiple custom models.
+For the 2024 season, PhotonVision ships with a **pre-trained NOTE detector** (shown above), as well as a mechanism for swapping in custom models. Future development will focus on enabling lower-friction management of multiple custom models.
 
 .. image:: images/notes-ui.png
 
diff --git a/source/docs/programming/photonlib/getting-target-data.rst b/source/docs/programming/photonlib/getting-target-data.rst
index 5342ccc2..c6264b89 100644
--- a/source/docs/programming/photonlib/getting-target-data.rst
+++ b/source/docs/programming/photonlib/getting-target-data.rst
@@ -6,7 +6,7 @@ Constructing a PhotonCamera
 What is a PhotonCamera?
 ^^^^^^^^^^^^^^^^^^^^^^^
 
-``PhotonCamera`` is a class in PhotonLib that allows a user to interact with one camera that is connected to hardware that is running Photon Vision. Through this class, users can retrieve yaw, pitch, roll, robot-relative pose, latency, and a wealth of other information.
+``PhotonCamera`` is a class in PhotonLib that allows a user to interact with one camera that is connected to hardware that is running PhotonVision. Through this class, users can retrieve yaw, pitch, roll, robot-relative pose, latency, and a wealth of other information.
 
 The ``PhotonCamera`` class has two constructors: one that takes a ``NetworkTable`` and another that takes in the name of the network table that PhotonVision is broadcasting information over. For ease of use, it is recommended to use the latter. The name of the NetworkTable (for the string constructor) should be the same as the camera's nickname (from the PhotonVision UI).
 
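For reference alongside the ``getting-target-data.rst`` hunk above, here is a minimal sketch of the ``PhotonCamera`` usage that paragraph describes. It assumes the 2024 PhotonLib Java API and a camera nicknamed ``photonvision`` in the PhotonVision UI; the nickname and the ``VisionExample`` class are illustrative placeholders, not part of this patch:

.. code-block:: java

   import org.photonvision.PhotonCamera;
   import org.photonvision.targeting.PhotonPipelineResult;
   import org.photonvision.targeting.PhotonTrackedTarget;

   public class VisionExample {
       // The string passed to the constructor must match the camera's nickname
       // from the PhotonVision UI (assumed to be "photonvision" here).
       private final PhotonCamera camera = new PhotonCamera("photonvision");

       public void printBestTargetYaw() {
           // Grab the most recent pipeline result published over NetworkTables.
           PhotonPipelineResult result = camera.getLatestResult();

           // Check that at least one target was seen before reading target data.
           if (result.hasTargets()) {
               PhotonTrackedTarget target = result.getBestTarget();
               System.out.println("Best target yaw (degrees): " + target.getYaw());
           }
       }
   }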