Autonomy Software C++ 24.5.1
Welcome to the Autonomy Software repository of the Mars Rover Design Team (MRDT) at Missouri University of Science and Technology (Missouri S&T)! This API reference contains the source code and other resources for the development of the autonomy software for our Mars rover. The Autonomy Software project aims to compete in the University Rover Challenge (URC) by demonstrating advanced autonomous capabilities and robust navigation algorithms.
ZEDCam Class Reference

This class implements and interfaces with the most common ZEDSDK cameras and features. It is designed in such a way that multiple other classes/threads can safely call any method of an object of this class without risking resource corruption or slowdown of the camera. More...

#include <ZEDCam.h>

Inheritance diagram for ZEDCam:
Collaboration diagram for ZEDCam:

Public Member Functions

 ZEDCam (const int nPropResolutionX, const int nPropResolutionY, const int nPropFramesPerSecond, const double dPropHorizontalFOV, const double dPropVerticalFOV, const bool bEnableRecordingFlag=false, const bool bExportSVORecordingFlag=false, const float fMinSenseDistance=constants::ZED_DEFAULT_MINIMUM_DISTANCE, const float fMaxSenseDistance=constants::ZED_DEFAULT_MAXIMUM_DISTANCE, const bool bMemTypeGPU=false, const bool bUseHalfDepthPrecision=false, const bool bEnableFusionMaster=false, const int nNumFrameRetrievalThreads=10, const unsigned int unCameraSerialNumber=0)
 Construct a new ZEDCam::ZEDCam object.
 
 ~ZEDCam ()
 Destroy the ZEDCam::ZEDCam object.
 
std::future< bool > RequestFrameCopy (cv::Mat &cvFrame) override
 Requests a regular BGRA image from the LEFT eye of the ZED camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. Remember that this code will be run in whatever class/thread calls it.
 
std::future< bool > RequestFrameCopy (cv::cuda::GpuMat &cvGPUFrame) override
 Grabs a regular BGRA image from the LEFT eye of the ZED camera and stores it in a GPU mat. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. Remember that this code will be run in whatever class/thread calls it.
 
std::future< bool > RequestDepthCopy (cv::Mat &cvDepth, const bool bRetrieveMeasure=true) override
 Requests a depth measure or image from the camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. This image has the same shape as a grayscale image, but the values represent the depth in MILLIMETERS. The ZEDSDK will always return this measure in MILLIMETERS.
 
std::future< bool > RequestDepthCopy (cv::cuda::GpuMat &cvGPUDepth, const bool bRetrieveMeasure=true) override
 Requests a depth measure or image from the camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. This image has the same shape as a grayscale image, but the values represent the depth in MILLIMETERS. The ZEDSDK will always return this measure in MILLIMETERS.
 
std::future< bool > RequestPointCloudCopy (cv::Mat &cvPointCloud) override
 Requests a point cloud image from the camera. This image has the same resolution as a normal image but with three XYZ values replacing the old color values in the 3rd dimension. The units and sign of the XYZ values are determined by ZED_MEASURE_UNITS and ZED_COORD_SYSTEM constants set in AutonomyConstants.h.
 
std::future< bool > RequestPointCloudCopy (cv::cuda::GpuMat &cvGPUPointCloud) override
 Grabs a point cloud image from the camera. This image has the same resolution as a normal image but with three XYZ values replacing the old color values in the 3rd dimension. The units and sign of the XYZ values are determined by ZED_MEASURE_UNITS and ZED_COORD_SYSTEM constants set in AutonomyConstants.h.
 
std::future< bool > RequestPositionalPoseCopy (Pose &stPose) override
 Requests the current pose of the camera relative to its start pose or the origin of the set pose. Puts a Pose pointer into a queue so a copy of a pose from the camera can be written to it. If positional tracking is not enabled, this method will return false and the ZEDCam::Pose may be uninitialized.
 
std::future< bool > RequestFusionGeoPoseCopy (sl::GeoPose &slGeoPose) override
 Requests the current geo pose of the camera. This method should be used to retrieve ONLY the GNSS data from the ZEDSDK's Fusion module. Puts a GeoPose pointer into a queue so a copy of a GeoPose from the camera can be written to it. If positional tracking or fusion is disabled for this camera, then this method will return false and the sl::GeoPose may be uninitialized.
 
std::future< bool > RequestFloorPlaneCopy (sl::Plane &slPlane) override
 Requests the current floor plane of the camera relative to its current pose. Puts a Plane pointer into a queue so a copy of the floor plane from the camera can be written to it. If positional tracking is not enabled, this method will return false and the sl::Plane may be uninitialized.
 
std::future< bool > RequestSensorsCopy (sl::SensorsData &slSensorsData) override
 Requests the most up to date sensors data from the camera. This data includes IMU pose and raw values, barometer, magnetometer, and temperature data.
 
std::future< bool > RequestObjectsCopy (std::vector< sl::ObjectData > &vObjectData) override
 Requests a current copy of the tracked objects from the camera. Puts a pointer to a vector containing sl::ObjectData into a queue so a copy of the tracked objects from the camera can be written to it.
 
std::future< bool > RequestBatchedObjectsCopy (std::vector< sl::ObjectsBatch > &vBatchedObjectData) override
 If batching is enabled, this requests the normal objects and passes them to the internal batching queue of the ZED API. This performs short-term re-identification with deep learning and trajectories filtering. Batching must have been set to enabled when EnableObjectDetection() was called. Most of the time the vector will be empty and will be filled every ZED_OBJDETECTION_BATCH_LATENCY.
 
sl::ERROR_CODE ResetPositionalTracking () override
 Resets the cameras X,Y,Z translation and Roll,Pitch,Yaw orientation back to 0. THINK CAREFULLY! Do you actually want to reset this? It will also realign the coordinate system to whichever way the camera happens to be facing.
 
sl::ERROR_CODE TrackCustomBoxObjects (std::vector< ZedObjectData > &vCustomObjects) override
 A vector containing CustomBoxObjectData objects. These objects simply store information about your detected objects from an external object detection model. You will need to take your inference results and package them into a sl::CustomBoxObjectData so that the ZEDSDK can properly interpret your detections.
 
sl::ERROR_CODE RebootCamera () override
 Performs a hardware reset of the ZED2 or ZED2i camera.
 
sl::FUSION_ERROR_CODE SubscribeFusionToCameraUUID (sl::CameraIdentifier &slCameraUUID) override
 Given a UUID for another ZEDCam, subscribes that camera to this camera's Fusion instance. This will tell this camera's Fusion instance to start ingesting and fusing data from the other camera.
 
sl::CameraIdentifier PublishCameraToFusion () override
 Signal this camera to make its data available to the Fusion module and retrieve a UUID for this class's sl::Camera instance that can be used to subscribe an sl::Fusion instance to this camera later.
 
sl::FUSION_ERROR_CODE IngestGPSDataToFusion (geoops::GPSCoordinate stNewGPSLocation) override
 If this camera is the fusion instance master, this method can be used to ingest/process/fuse the current GNSS position of the camera with the ZEDSDK positional tracking. This allows the use of RequestFusionGeoPoseCopy() to get a high resolution and highly accurate GNSS/VIO camera pose.
 
sl::ERROR_CODE EnablePositionalTracking (const float fExpectedCameraHeightFromFloorTolerance=constants::ZED_DEFAULT_FLOOR_PLANE_ERROR) override
 Enable the positional tracking functionality of the camera.
 
void DisablePositionalTracking () override
 Disable the positional tracking functionality of the camera.
 
void SetPositionalPose (const double dX, const double dY, const double dZ, const double dXO, const double dYO, const double dZO) override
 Sets the pose of the positional tracking of the camera. XYZ will point in their respective directions according to ZED_COORD_SYSTEM defined in AutonomyConstants.h.
 
sl::ERROR_CODE EnableSpatialMapping () override
 Enables the spatial mapping feature of the camera. Pose tracking will be enabled if it is not already.
 
void DisableSpatialMapping () override
 Disables the spatial mapping feature of the camera.
 
sl::ERROR_CODE EnableObjectDetection (const bool bEnableBatching=false) override
 Enables the object detection and tracking feature of the camera.
 
void DisableObjectDetection () override
 Disables the object detection and tracking feature of the camera.
 
bool GetCameraIsOpen () override
 Accessor for the current status of the camera.
 
bool GetUsingGPUMem () const override
 Accessor for if this ZED is storing its frames in GPU memory.
 
std::string GetCameraModel () override
 Accessor for the camera model enum from the ZEDSDK, represented as a string.
 
unsigned int GetCameraSerial () override
 Accessor for the camera's serial number.
 
bool GetPositionalTrackingEnabled () override
 Accessor for if the positional tracking functionality of the camera is enabled and functioning.
 
sl::PositionalTrackingStatus GetPositionalTrackingState () override
 Accessor for the current positional tracking status of the camera.
 
sl::FusedPositionalTrackingStatus GetFusedPositionalTrackingState () override
 Accessor for the current positional tracking status of the fusion instance.
 
sl::SPATIAL_MAPPING_STATE GetSpatialMappingState () override
 Accessor for the current state of the camera's spatial mapping feature.
 
sl::SPATIAL_MAPPING_STATE ExtractSpatialMapAsync (std::future< sl::Mesh > &fuMeshFuture) override
 Retrieve the built spatial map from the camera. Spatial mapping must be enabled. This method takes in an std::future<sl::Mesh> to eventually store the map in. It returns an enum code representing the successful scheduling of building the map. Any code other than SPATIAL_MAPPING_STATE::OK means the future will never be filled.
 
bool GetObjectDetectionEnabled () override
 Accessor for if the cameras object detection and tracking feature is enabled.
 
- Public Member Functions inherited from ZEDCamera
 ZEDCamera (const int nPropResolutionX, const int nPropResolutionY, const int nPropFramesPerSecond, const double dPropHorizontalFOV, const double dPropVerticalFOV, const bool bEnableRecordingFlag, const bool bMemTypeGPU, const bool bUseHalfDepthPrecision, const bool bEnableFusionMaster, const int nNumFrameRetrievalThreads, const unsigned int unCameraSerialNumber)
 Construct a new ZEDCamera object.
 
virtual ~ZEDCamera ()=default
 Destroy the ZEDCamera object.
 
virtual bool GetIsFusionMaster () const
 Accessor for if this ZED is running a fusion instance.
 
- Public Member Functions inherited from Camera< cv::Mat >
 Camera (const int nPropResolutionX, const int nPropResolutionY, const int nPropFramesPerSecond, const PIXEL_FORMATS ePropPixelFormat, const double dPropHorizontalFOV, const double dPropVerticalFOV, const bool bEnableRecordingFlag, const int nNumFrameRetrievalThreads=5)
 Construct a new Camera object.
 
virtual ~Camera ()
 Destroy the Camera object.
 
cv::Size GetPropResolution () const
 Accessor for the Prop Resolution private member.
 
int GetPropFramesPerSecond () const
 Accessor for the Prop Frames Per Second private member.
 
PIXEL_FORMATS GetPropPixelFormat () const
 Accessor for the Prop Pixel Format private member.
 
double GetPropHorizontalFOV () const
 Accessor for the Prop Horizontal FOV private member.
 
double GetPropVerticalFOV () const
 Accessor for the Prop Vertical FOV private member.
 
bool GetEnableRecordingFlag () const
 Accessor for the Enable Recording Flag private member.
 
void SetEnableRecordingFlag (const bool bEnableRecordingFlag)
 Mutator for the Enable Recording Flag private member.
 
- Public Member Functions inherited from AutonomyThread< void >
 AutonomyThread ()
 Construct a new Autonomy Thread object.
 
virtual ~AutonomyThread ()
 Destroy the Autonomy Thread object. If the parent object or main thread is destroyed or exited while this thread is still running, a race condition will occur. Stopping and joining the thread here ensures that the main program can't exit if the user forgot to stop and join the thread.
 
void Start ()
 When this method is called, it starts a new thread that runs the code within the ThreadedContinuousCode method. This is the user's main code that will run the important and continuous code for the class.
 
void RequestStop ()
 Signals the thread to stop executing user code and terminate. DOES NOT JOIN. This method will not force the thread to exit; if the user code is not written properly and contains a WHILE statement or any other long-executing or blocking code, then the thread will not exit until the next iteration.
 
void Join ()
 Waits for the thread to finish executing and then closes the thread. This method will block the calling code until the thread is finished.
 
bool Joinable () const
 Check if the code within the thread and all pools created by it are finished executing and the thread is ready to be closed.
 
AutonomyThreadState GetThreadState () const
 Accessor for the Threads State private member.
 
IPS GetIPS ()
 Accessor for the Frame IPS private member.
 

Private Member Functions

void ThreadedContinuousCode () override
 The code inside this private method runs in a separate thread, but still has access to this*. This method continuously calls the grab() function of the ZEDSDK, which updates all frames (RGB, depth, cloud) and all other data such as positional and spatial mapping. Then a thread pool is started and joined once per iteration to mass copy the frames and/or measure to any other thread waiting in the queues.
 
void PooledLinearCode () override
 This method holds the code that is run in the thread pool started by the ThreadedLinearCode() method. It copies the data from the different data objects to references of the same type stored in a queue filled by the Request methods.
 

Private Attributes

sl::Camera m_slCamera
 
std::shared_mutex m_muCameraMutex
 
sl::InitParameters m_slCameraParams
 
sl::RuntimeParameters m_slRuntimeParams
 
sl::RecordingParameters m_slRecordingParams
 
sl::Fusion m_slFusionInstance
 
std::shared_mutex m_muFusionMutex
 
sl::InitFusionParameters m_slFusionParams
 
sl::MEASURE m_slDepthMeasureType
 
sl::PositionalTrackingParameters m_slPoseTrackingParams
 
sl::PositionalTrackingFusionParameters m_slFusionPoseTrackingParams
 
sl::Pose m_slCameraPose
 
sl::GeoPose m_slFusionGeoPose
 
sl::Plane m_slFloorPlane
 
sl::Transform m_slFloorTrackingTransform
 
sl::SensorsData m_slSensorsData
 
sl::SpatialMappingParameters m_slSpatialMappingParams
 
sl::ObjectDetectionParameters m_slObjectDetectionParams
 
sl::BatchParameters m_slObjectDetectionBatchParams
 
sl::Objects m_slDetectedObjects
 
std::vector< sl::ObjectsBatch > m_slDetectedObjectsBatched
 
sl::MEM m_slMemoryType
 
sl::MODEL m_slCameraModel
 
float m_fExpectedCameraHeightFromFloorTolerance
 
double m_dPoseOffsetX
 
double m_dPoseOffsetY
 
double m_dPoseOffsetZ
 
double m_dPoseOffsetXO
 
double m_dPoseOffsetYO
 
double m_dPoseOffsetZO
 
geoops::GPSCoordinate m_stCurrentGPSBasedPosition
 
sl::Mat m_slFrame
 
sl::Mat m_slDepthImage
 
sl::Mat m_slDepthMeasure
 
sl::Mat m_slPointCloud
 
std::queue< containers::FrameFetchContainer< cv::cuda::GpuMat > > m_qGPUFrameCopySchedule
 
std::queue< containers::DataFetchContainer< std::vector< ZedObjectData > > > m_qCustomBoxIngestSchedule
 
std::queue< containers::DataFetchContainer< Pose > > m_qPoseCopySchedule
 
std::queue< containers::DataFetchContainer< sl::GeoPose > > m_qGeoPoseCopySchedule
 
std::queue< containers::DataFetchContainer< sl::Plane > > m_qFloorCopySchedule
 
std::queue< containers::DataFetchContainer< sl::SensorsData > > m_qSensorsCopySchedule
 
std::queue< containers::DataFetchContainer< std::vector< sl::ObjectData > > > m_qObjectDataCopySchedule
 
std::queue< containers::DataFetchContainer< std::vector< sl::ObjectsBatch > > > m_qObjectBatchedDataCopySchedule
 
std::shared_mutex m_muCustomBoxIngestMutex
 
std::shared_mutex m_muPoseCopyMutex
 
std::shared_mutex m_muGeoPoseCopyMutex
 
std::shared_mutex m_muFloorCopyMutex
 
std::shared_mutex m_muSensorsCopyMutex
 
std::shared_mutex m_muObjectDataCopyMutex
 
std::shared_mutex m_muObjectBatchedDataCopyMutex
 
std::atomic< bool > m_bNormalFramesQueued
 
std::atomic< bool > m_bDepthFramesQueued
 
std::atomic< bool > m_bPointCloudsQueued
 
std::atomic< bool > m_bPosesQueued
 
std::atomic< bool > m_bGeoPosesQueued
 
std::atomic< bool > m_bFloorsQueued
 
std::atomic< bool > m_bSensorsQueued
 
std::atomic< bool > m_bObjectsQueued
 
std::atomic< bool > m_bBatchedObjectsQueued
 

Additional Inherited Members

- Public Types inherited from AutonomyThread< void >
enum  AutonomyThreadState
 
- Protected Member Functions inherited from AutonomyThread< void >
void RunPool (const unsigned int nNumTasksToQueue, const unsigned int nNumThreads=2, const bool bForceStopCurrentThreads=false)
 When this method is called, it starts/adds tasks to a thread pool that runs nNumTasksToQueue copies of the code within the PooledLinearCode() method using nNumThreads number of threads. This is meant to be used as an internal utility of the child class to further improve parallelization. Default value for nNumThreads is 2.
 
void RunDetachedPool (const unsigned int nNumTasksToQueue, const unsigned int nNumThreads=2, const bool bForceStopCurrentThreads=false)
 When this method is called, it starts a thread pool full of threads that don't return std::futures (like a placeholder for the thread return type). This means the thread will not have a return type and there is no way to determine if the thread has finished other than calling the Join() method. Only use this if you want to 'set and forget'. It will be faster as it doesn't return futures. Runs PooledLinearCode() method code. This is meant to be used as an internal utility of the child class to further improve parallelization.
 
void ParallelizeLoop (const int nNumThreads, const N tTotalIterations, F &&tLoopFunction)
 Given a ref-qualified looping function and an arbitrary number of iterations, this method will divide up the loop and run each section in a thread pool. This function must not return anything. This method will block until the loop has completed.
 
void ClearPoolQueue ()
 Clears any tasks waiting to be run in the queue; tasks currently running will remain running.
 
void JoinPool ()
 Waits for the pool to finish executing tasks. This method will block the calling code until the pool is finished.
 
bool PoolJoinable () const
 Check if the internal pool threads are done executing code and the queue is empty.
 
void SetMainThreadIPSLimit (int nMaxIterationsPerSecond=0)
 Mutator for the Main Thread Max IPS private member.
 
int GetPoolNumOfThreads ()
 Accessor for the Pool Num Of Threads private member.
 
int GetPoolQueueLength ()
 Accessor for the Pool Queue Size private member.
 
std::vector< void > GetPoolResults ()
 Accessor for the Pool Results private member. The action of getting results will destroy and remove them from this object. This method blocks if the thread is not finished, so no need to call JoinPool() before getting results.
 
int GetMainThreadMaxIPS () const
 Accessor for the Main Thread Max IPS private member.
 
- Protected Attributes inherited from ZEDCamera
const std::memory_order ATOMIC_MEMORY_ORDER_METHOD = std::memory_order_relaxed
 
unsigned int m_unCameraSerialNumber
 
bool m_bCameraIsFusionMaster
 
- Protected Attributes inherited from Camera< cv::Mat >
int m_nPropResolutionX
 
int m_nPropResolutionY
 
int m_nPropFramesPerSecond
 
int m_nNumFrameRetrievalThreads
 
PIXEL_FORMATS m_ePropPixelFormat
 
double m_dPropHorizontalFOV
 
double m_dPropVerticalFOV
 
std::atomic_bool m_bEnableRecordingFlag
 
std::queue< containers::FrameFetchContainer< cv::Mat > > m_qFrameCopySchedule
 
std::shared_mutex m_muPoolScheduleMutex
 
std::shared_mutex m_muFrameCopyMutex
 
- Protected Attributes inherited from AutonomyThread< void >
IPS m_IPS
 

Detailed Description

This class implements and interfaces with the most common ZEDSDK cameras and features. It is designed in such a way that multiple other classes/threads can safely call any method of an object of this class without risking resource corruption or slowdown of the camera.

Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-09-21
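
The class is built around a request/future pattern: callers enqueue a request for data, and the internal grab thread fulfills it on its next iteration. Below is a minimal usage sketch, not taken from the repository, showing construction, starting the thread, and waiting on a frame request; the resolution, FPS, and FOV arguments are illustrative values only.

#include <future>
#include <opencv2/core.hpp>

#include <ZEDCam.h>

void ExampleGrabOneFrame()
{
    // Construct with illustrative property values; the remaining parameters keep their documented defaults.
    ZEDCam zedMainCam(1280, 720, 60, 110.0, 70.0);

    // Start the grab thread inherited from AutonomyThread.
    zedMainCam.Start();

    // Request a LEFT-eye BGRA frame copy and wait for the camera thread to fulfill it.
    cv::Mat cvFrame;
    std::future<bool> fuCopied = zedMainCam.RequestFrameCopy(cvFrame);
    if (fuCopied.get())
    {
        // cvFrame now holds the most recent LEFT-eye BGRA image.
    }

    // Stop and join the grab thread before the object goes out of scope.
    zedMainCam.RequestStop();
    zedMainCam.Join();
}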

Constructor & Destructor Documentation

◆ ZEDCam()

ZEDCam::ZEDCam ( const int  nPropResolutionX,
const int  nPropResolutionY,
const int  nPropFramesPerSecond,
const double  dPropHorizontalFOV,
const double  dPropVerticalFOV,
const bool  bEnableRecordingFlag = false,
const bool  bExportSVORecordingFlag = false,
const float  fMinSenseDistance = constants::ZED_DEFAULT_MINIMUM_DISTANCE,
const float  fMaxSenseDistance = constants::ZED_DEFAULT_MAXIMUM_DISTANCE,
const bool  bMemTypeGPU = false,
const bool  bUseHalfDepthPrecision = false,
const bool  bEnableFusionMaster = false,
const int  nNumFrameRetrievalThreads = 10,
const unsigned int  unCameraSerialNumber = 0 
)

Construct a new ZEDCam::ZEDCam object.

Parameters
nPropResolutionX - X res of camera. Must be smaller than ZED_BASE_RESOLUTION.
nPropResolutionY - Y res of camera. Must be smaller than ZED_BASE_RESOLUTION.
nPropFramesPerSecond - FPS camera is running at.
dPropHorizontalFOV - The horizontal field of view.
dPropVerticalFOV - The vertical field of view.
bEnableRecordingFlag - Whether or not this camera should be recorded.
bExportSVORecordingFlag - Whether or not this camera should record and output an SVO file.
fMinSenseDistance - The minimum distance to include in depth measures.
fMaxSenseDistance - The maximum distance to include in depth measures.
bMemTypeGPU - Whether or not to use the GPU memory for operations.
bUseHalfDepthPrecision - Whether or not to use a float16 instead of float32 for depth measurements.
bEnableFusionMaster - Enables ZEDSDK Fusion integration for this camera. This camera will serve as the master instance for all fusion functions.
nNumFrameRetrievalThreads - The number of threads to use for copying frames/data to requests.
unCameraSerialNumber - The serial number of the camera to open.
Note
Do not set bEnableFusionMaster to true if you want to subscribe the camera to another camera! Only one camera should have Fusion enabled! To subscribe a camera to the camera running the master fusion instance, use the GetFusionInstance() and GetCameraSerial() functions. Refer to Fusion documentation for info on working with fusion functions: https://www.stereolabs.com/docs/api/classsl_1_1Fusion.html
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26
53 :
54 ZEDCamera(nPropResolutionX,
55 nPropResolutionY,
56 nPropFramesPerSecond,
57 dPropHorizontalFOV,
58 dPropVerticalFOV,
59 bEnableRecordingFlag,
60 bMemTypeGPU,
61 bUseHalfDepthPrecision,
62 bEnableFusionMaster,
63 nNumFrameRetrievalThreads,
64 unCameraSerialNumber)
65{
66 // Assign member variables.
67 bMemTypeGPU ? m_slMemoryType = sl::MEM::GPU : m_slMemoryType = sl::MEM::CPU;
68 bUseHalfDepthPrecision ? m_slDepthMeasureType = sl::MEASURE::DEPTH_U16_MM : m_slDepthMeasureType = sl::MEASURE::DEPTH;
69 m_dPoseOffsetX = 0.0;
70 m_dPoseOffsetY = 0.0;
71 m_dPoseOffsetZ = 0.0;
72 m_dPoseOffsetXO = 0.0;
73 m_dPoseOffsetYO = 0.0;
74 m_dPoseOffsetZO = 0.0;
75 // Initialize queued toggles.
76 m_bNormalFramesQueued = false;
77 m_bDepthFramesQueued = false;
78 m_bPointCloudsQueued = false;
79 m_bPosesQueued = false;
80 m_bGeoPosesQueued = false;
81 m_bFloorsQueued = false;
82 m_bObjectsQueued = false;
83 m_bBatchedObjectsQueued = false;
84
85 // Setup camera params.
86 m_slCameraParams.camera_resolution = constants::ZED_BASE_RESOLUTION;
87 m_slCameraParams.camera_fps = nPropFramesPerSecond;
88 m_slCameraParams.coordinate_units = constants::ZED_MEASURE_UNITS;
89 m_slCameraParams.coordinate_system = constants::ZED_COORD_SYSTEM;
90 m_slCameraParams.sdk_verbose = constants::ZED_SDK_VERBOSE;
91 m_slCameraParams.depth_mode = constants::ZED_DEPTH_MODE;
92 m_slCameraParams.depth_minimum_distance = fMinSenseDistance;
93 m_slCameraParams.depth_maximum_distance = fMaxSenseDistance;
94 m_slCameraParams.depth_stabilization = constants::ZED_DEPTH_STABILIZATION;
95 // Only set serial number if necessary.
96 if (unCameraSerialNumber != static_cast<unsigned int>(0))
97 {
98 m_slCameraParams.input.setFromSerialNumber(unCameraSerialNumber);
99 }
100
101 // Setup camera runtime params.
102 m_slRuntimeParams.enable_fill_mode = constants::ZED_SENSING_FILL;
103 // Setup SVO recording parameters.
104 m_slRecordingParams.compression_mode = constants::ZED_SVO_COMPRESSION;
105 m_slRecordingParams.bitrate = constants::ZED_SVO_BITRATE;
106
107 // Setup positional tracking parameters.
108 m_slPoseTrackingParams.mode = constants::ZED_POSETRACK_MODE;
109 m_slPoseTrackingParams.enable_area_memory = constants::ZED_POSETRACK_AREA_MEMORY;
110 m_slPoseTrackingParams.enable_pose_smoothing = constants::ZED_POSETRACK_POSE_SMOOTHING;
111 m_slPoseTrackingParams.set_floor_as_origin = constants::ZED_POSETRACK_FLOOR_IS_ORIGIN;
112 m_slPoseTrackingParams.enable_imu_fusion = constants::ZED_POSETRACK_ENABLE_IMU_FUSION;
113 m_slPoseTrackingParams.depth_min_range = constants::ZED_POSETRACK_USABLE_DEPTH_MIN;
114 m_slPoseTrackingParams.set_gravity_as_origin = constants::ZED_POSETRACK_USE_GRAVITY_ORIGIN;
115
116 // Setup spatial mapping parameters.
117 m_slSpatialMappingParams.map_type = constants::ZED_MAPPING_TYPE;
118 m_slSpatialMappingParams.resolution_meter = constants::ZED_MAPPING_RESOLUTION_METER;
119 m_slSpatialMappingParams.save_texture = true;
120 m_slSpatialMappingParams.use_chunk_only = constants::ZED_MAPPING_USE_CHUNK_ONLY;
121 m_slSpatialMappingParams.stability_counter = constants::ZED_MAPPING_STABILITY_COUNTER;
122 // Set or auto-set max depth range for mapping.
123 if (constants::ZED_MAPPING_RANGE_METER <= 0)
124 {
125 // Automatically guess the best mapping depth range.
126 m_slSpatialMappingParams.range_meter = m_slSpatialMappingParams.getRecommendedRange(constants::ZED_MAPPING_RESOLUTION_METER, m_slCamera);
127 }
128 else
129 {
130 // Manually set.
131 m_slSpatialMappingParams.range_meter = constants::ZED_MAPPING_RANGE_METER;
132 }
133
134 // Setup object detection/tracking parameters.
135 m_slObjectDetectionParams.detection_model = sl::OBJECT_DETECTION_MODEL::CUSTOM_BOX_OBJECTS;
136 m_slObjectDetectionParams.enable_tracking = constants::ZED_OBJDETECTION_TRACK_OBJ;
137 m_slObjectDetectionParams.enable_segmentation = constants::ZED_OBJDETECTION_SEGMENTATION;
138 m_slObjectDetectionParams.filtering_mode = constants::ZED_OBJDETECTION_FILTERING;
139 m_slObjectDetectionParams.prediction_timeout_s = constants::ZED_OBJDETECTION_TRACKING_PREDICTION_TIMEOUT;
140 // Setup object detection/tracking batch parameters.
141 m_slObjectDetectionBatchParams.enable = false;
142 m_slObjectDetectionBatchParams.id_retention_time = constants::ZED_OBJDETECTION_BATCH_RETENTION_TIME;
143 m_slObjectDetectionBatchParams.latency = constants::ZED_OBJDETECTION_BATCH_LATENCY;
144 m_slObjectDetectionParams.batch_parameters = m_slObjectDetectionBatchParams;
145
146 // Attempt to open camera.
147 sl::ERROR_CODE slReturnCode = m_slCamera.open(m_slCameraParams);
148 // Check if the camera was successfully opened.
149 if (m_slCamera.isOpened())
150 {
151 // Update camera serial number if camera was opened with autodetect.
152 m_unCameraSerialNumber = m_slCamera.getCameraInformation().serial_number;
153 // Update camera model.
154 m_slCameraModel = m_slCamera.getCameraInformation().camera_model;
155 // Check if the camera should record and output an SVO file.
156 if (bExportSVORecordingFlag)
157 {
158 // Now that camera is opened get camera name and construct path.
159 std::string szSVOFilePath = constants::LOGGING_OUTPUT_PATH_ABSOLUTE + "/" + logging::g_szProgramStartTimeString + "/" + this->GetCameraModel() + "_" +
160 std::to_string(this->GetCameraSerial());
161 m_slRecordingParams.video_filename = szSVOFilePath.c_str();
162 // Enable recording.
163 sl::ERROR_CODE slReturnCode = m_slCamera.enableRecording(m_slRecordingParams);
164 // Check if recording was enabled successfully.
165 if (slReturnCode == sl::ERROR_CODE::SUCCESS)
166 {
167 // Submit logger message.
168 LOG_DEBUG(logging::g_qSharedLogger,
169 "Successfully enabled SVO recording for {} ZED stereo camera with serial number {}.",
170 this->GetCameraModel(),
171 m_unCameraSerialNumber);
172 }
173 else
174 {
175 // Submit logger message.
176 LOG_ERROR(logging::g_qSharedLogger,
177 "Failed to enable SVO recording for {} ZED stereo camera with serial number {}. sl::ERROR_CODE is {}",
178 this->GetCameraModel(),
179 m_unCameraSerialNumber,
180 sl::toString(slReturnCode).c_str());
181 }
182 }
183
184 // Submit logger message.
185 LOG_INFO(logging::g_qSharedLogger, "{} ZED stereo camera with serial number {} has been successfully opened.", this->GetCameraModel(), m_unCameraSerialNumber);
186 }
187 else
188 {
189 // Submit logger message.
190 LOG_ERROR(logging::g_qSharedLogger,
191 "Unable to open ZED stereo camera {} ({})! sl::ERROR_CODE is: {}",
192 sl::toString(m_slCameraModel).get(),
193 m_unCameraSerialNumber,
194 sl::toString(slReturnCode).get());
195 }
196
 197 // Check if this camera should serve as the master Fusion instance.
198 if (bEnableFusionMaster)
199 {
200 // Setup Fusion params.
201 m_slFusionParams.coordinate_units = constants::FUSION_MEASUREMENT_UNITS;
202 m_slFusionParams.coordinate_system = constants::FUSION_COORD_SYSTEM;
203 m_slFusionParams.verbose = constants::FUSION_SDK_VERBOSE;
204 // Setup Fusion positional tracking parameters.
205 m_slFusionPoseTrackingParams.enable_GNSS_fusion = constants::FUSION_ENABLE_GNSS_FUSION;
206
207 // Initialize fusion instance for camera.
208 sl::FUSION_ERROR_CODE slReturnCode = m_slFusionInstance.init(m_slFusionParams);
209 // Check if fusion initialized properly.
210 if (slReturnCode == sl::FUSION_ERROR_CODE::SUCCESS)
211 {
212 // Enable odometry publishing for this ZED camera.
213 m_slCamera.startPublishing();
214 // Subscribe this camera to fusion instance.
215 slReturnCode = m_slFusionInstance.subscribe(sl::CameraIdentifier(m_unCameraSerialNumber));
216
217 // Check if this camera was successfully subscribed to the Fusion instance.
218 if (slReturnCode == sl::FUSION_ERROR_CODE::SUCCESS)
219 {
220 // Submit logger message.
221 LOG_DEBUG(logging::g_qSharedLogger, "Initialized FUSION instance for ZED camera {} ({})!", sl::toString(m_slCameraModel).get(), m_unCameraSerialNumber);
222 }
223 else
224 {
225 // Submit logger message.
226 LOG_DEBUG(logging::g_qSharedLogger,
227 "Unable to subscribe to internal FUSION instance for camera {} ({})! sl::FUSION_ERROR_CODE is: {}",
228 sl::toString(m_slCameraModel).get(),
229 m_unCameraSerialNumber,
230 sl::toString(slReturnCode).get());
231 }
232 }
233 else
234 {
235 // Submit logger message.
236 LOG_ERROR(logging::g_qSharedLogger,
237 "Unable to initialize FUSION instance for camera {} ({})! sl::FUSION_ERROR_CODE is: {}",
238 sl::toString(m_slCameraModel).get(),
239 m_unCameraSerialNumber,
240 sl::toString(slReturnCode).get());
241 }
242 }
243
244 // Set max FPS of the ThreadedContinuousCode method.
245 this->SetMainThreadIPSLimit(nPropFramesPerSecond);
246}
void SetMainThreadIPSLimit(int nMaxIterationsPerSecond=0)
Mutator for the Main Thread Max IPS private member.
Definition AutonomyThread.hpp:467
unsigned int GetCameraSerial() override
Accessor for the camera's serial number.
Definition ZEDCam.cpp:2275
std::string GetCameraModel() override
Accessor for the camera model enum from the ZEDSDK, represented as a string.
Definition ZEDCam.cpp:2246
This class serves as a middle inheritor between the Camera interface and the ZEDCam class....
Definition ZEDCamera.hpp:33

◆ ~ZEDCam()

ZEDCam::~ZEDCam ( )

Destroy the ZEDCam::ZEDCam object.

Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26
256{
257 // Stop threaded code.
258 this->RequestStop();
259 this->Join();
260
261 // Check if fusion master instance has been started for this camera.
262 if (m_bCameraIsFusionMaster)
263 {
264 // Close Fusion instance.
265 m_slFusionInstance.close();
266 }
267
268 // Close the ZEDCam.
269 m_slCamera.close();
270
271 // Submit logger message.
272 LOG_INFO(logging::g_qSharedLogger, "ZED stereo camera with serial number {} has been successfully closed.", m_unCameraSerialNumber);
273}
void Join()
Waits for thread to finish executing and then closes thread. This method will block the calling code ...
Definition AutonomyThread.hpp:180
void RequestStop()
Signals threads to stop executing user code, terminate. DOES NOT JOIN. This method will not force the...
Definition AutonomyThread.hpp:164

Member Function Documentation

◆ RequestFrameCopy() [1/2]

std::future< bool > ZEDCam::RequestFrameCopy ( cv::Mat &  cvFrame)
overridevirtual

Requests a regular BGRA image from the LEFT eye of the ZED camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. Remember that this code will be run in whatever class/thread calls it.

Parameters
cvFrame - A reference to the cv::Mat to copy the normal frame to.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
ClayJay3 (claytonraycowen@gmail.com)
Date
2023-09-09

Implements ZEDCamera.

1004{
1005 // Assemble the FrameFetchContainer.
1006 containers::FrameFetchContainer<cv::Mat> stContainer(cvFrame, PIXEL_FORMATS::eBGRA);
1007
1008 // Acquire lock on frame copy queue.
1009 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1010 // Append frame fetch container to the schedule queue.
1011 m_qFrameCopySchedule.push(stContainer);
1012 // Release lock on the frame schedule queue.
1013 lkSchedulers.unlock();
1014
1015 // Check if frame queue toggle has already been set.
1016 if (!m_bNormalFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1017 {
1018 // Signify that the frame queue is not empty.
1019 m_bNormalFramesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1020 }
1021
1022 // Return the future from the promise stored in the container.
1023 return stContainer.pCopiedFrameStatus->get_future();
1024}
This struct is used to carry references to camera frames for scheduling and copying....
Definition FetchContainers.hpp:89

◆ RequestFrameCopy() [2/2]

std::future< bool > ZEDCam::RequestFrameCopy ( cv::cuda::GpuMat &  cvGPUFrame)
overridevirtual

Grabs a regular BGRA image from the LEFT eye of the ZED camera and stores it in a GPU mat. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. Remember that this code will be run in whatever class/thread calls it.

Parameters
cvGPUFrame - A reference to the cv::cuda::GpuMat to copy the normal frame to.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
ClayJay3 (claytonraycowen@gmail.com)
Date
2023-09-09

Reimplemented from ZEDCamera.

1039{
1040 // Assemble the FrameFetchContainer.
1041 containers::FrameFetchContainer<cv::cuda::GpuMat> stContainer(cvGPUFrame, PIXEL_FORMATS::eBGRA);
1042
1043 // Acquire lock on frame copy queue.
1044 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1045 // Append frame fetch container to the schedule queue.
1046 m_qGPUFrameCopySchedule.push(stContainer);
1047 // Release lock on the frame schedule queue.
1048 lkSchedulers.unlock();
1049
1050 // Check if frame queue toggle has already been set.
1051 if (!m_bNormalFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1052 {
1053 // Signify that the frame queue is not empty.
1054 m_bNormalFramesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1055 }
1056
1057 // Return the future from the promise stored in the container.
1058 return stContainer.pCopiedFrameStatus->get_future();
1059}

◆ RequestDepthCopy() [1/2]

std::future< bool > ZEDCam::RequestDepthCopy ( cv::Mat &  cvDepth,
const bool  bRetrieveMeasure = true 
)
overridevirtual

Requests a depth measure or image from the camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. This image has the same shape as a grayscale image, but the values represent the depth in MILLIMETERS. The ZEDSDK will always return this measure in MILLIMETERS.

Parameters
cvDepth - A reference to the cv::Mat to copy the depth frame to.
bRetrieveMeasure - False to get depth IMAGE instead of MEASURE. Do not use the 8-bit grayscale depth image for purposes other than displaying depth.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26

Implements ZEDCamera.

1077{
1078 // Create instance variables.
1079 PIXEL_FORMATS eFrameType;
1080
1081 // Check if the container should be set to retrieve an image or a measure.
1082 bRetrieveMeasure ? eFrameType = PIXEL_FORMATS::eDepthMeasure : eFrameType = PIXEL_FORMATS::eDepthImage;
1083 // Assemble container.
1084 containers::FrameFetchContainer<cv::Mat> stContainer(cvDepth, eFrameType);
1085
1086 // Acquire lock on frame copy queue.
1087 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1088 // Append frame fetch container to the schedule queue.
1089 m_qFrameCopySchedule.push(stContainer);
1090 // Release lock on the frame schedule queue.
1091 lkSchedulers.unlock();
1092
1093 // Check if frame queue toggle has already been set.
1094 if (!m_bDepthFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1095 {
1096 // Signify that the frame queue is not empty.
1097 m_bDepthFramesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1098 }
1099
1100 // Return the future from the promise stored in the container.
1101 return stContainer.pCopiedFrameStatus->get_future();
1102}

◆ RequestDepthCopy() [2/2]

std::future< bool > ZEDCam::RequestDepthCopy ( cv::cuda::GpuMat &  cvGPUDepth,
const bool  bRetrieveMeasure = true 
)
overridevirtual

Requests a depth measure or image from the camera. Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it. This image has the same shape as a grayscale image, but the values represent the depth in MILLIMETERS. The ZEDSDK will always return this measure in MILLIMETERS.

Parameters
cvGPUDepth - A reference to the cv::cuda::GpuMat to copy the depth frame to.
bRetrieveMeasure - False to get depth IMAGE instead of MEASURE. Do not use the 8-bit grayscale depth image for purposes other than displaying depth.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26

Reimplemented from ZEDCamera.

1120{
1121 // Create instance variables.
1122 PIXEL_FORMATS eFrameType;
1123
1124 // Check if the container should be set to retrieve an image or a measure.
1125 bRetrieveMeasure ? eFrameType = PIXEL_FORMATS::eDepthMeasure : eFrameType = PIXEL_FORMATS::eDepthImage;
1126 // Assemble container.
1127 containers::FrameFetchContainer<cv::cuda::GpuMat> stContainer(cvGPUDepth, eFrameType);
1128
1129 // Acquire lock on frame copy queue.
1130 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1131 // Append frame fetch container to the schedule queue.
1132 m_qGPUFrameCopySchedule.push(stContainer);
1133 // Release lock on the frame schedule queue.
1134 lkSchedulers.unlock();
1135
1136 // Check if frame queue toggle has already been set.
1137 if (!m_bDepthFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1138 {
1139 // Signify that the frame queue is not empty.
1140 m_bDepthFramesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1141 }
1142
1143 // Return the future from the promise stored in the container.
1144 return stContainer.pCopiedFrameStatus->get_future();
1145}

◆ RequestPointCloudCopy() [1/2]

std::future< bool > ZEDCam::RequestPointCloudCopy ( cv::Mat &  cvPointCloud)
overridevirtual

Requests a point cloud image from the camera. This image has the same resolution as a normal image but with three XYZ values replacing the old color values in the 3rd dimension. The units and sign of the XYZ values are determined by ZED_MEASURE_UNITS and ZED_COORD_SYSTEM constants set in AutonomyConstants.h.

A 4th value in the 3rd dimension exists as a float32 storing the BGRA values. Each color value is 8 bits, packed into the 32-bit float in the order B, G, R, A (one byte per channel).

Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it.

Parameters
cvPointCloud - A reference to the cv::Mat to copy the point cloud frame to.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26

Implements ZEDCamera.

1168{
1169 // Assemble the FrameFetchContainer.
1170 containers::FrameFetchContainer<cv::Mat> stContainer(cvPointCloud, PIXEL_FORMATS::eXYZBGRA);
1171
1172 // Acquire lock on frame copy queue.
1173 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1174 // Append frame fetch container to the schedule queue.
1175 m_qFrameCopySchedule.push(stContainer);
1176 // Release lock on the frame schedule queue.
1177 lkSchedulers.unlock();
1178
1179 // Check if point cloud queue toggle has already been set.
1180 if (!m_bPointCloudsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1181 {
1182 // Signify that the point cloud queue is not empty.
1183 m_bPointCloudsQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1184 }
1185
1186 // Return the future from the promise stored in the container.
1187 return stContainer.pCopiedFrameStatus->get_future();
1188}

◆ RequestPointCloudCopy() [2/2]

std::future< bool > ZEDCam::RequestPointCloudCopy ( cv::cuda::GpuMat &  cvGPUPointCloud)
overridevirtual

Grabs a point cloud image from the camera. This image has the same resolution as a normal image but with three XYZ values replacing the old color values in the 3rd dimension. The units and sign of the XYZ values are determined by ZED_MEASURE_UNITS and ZED_COORD_SYSTEM constants set in AutonomyConstants.h.

A 4th value in the 3rd dimension exists as a float32 storing the BGRA values. Each color value is 8 bits, packed into the 32-bit float in the order B, G, R, A (one byte per channel).

Puts a frame pointer into a queue so a copy of a frame from the camera can be written to it.

Parameters
cvGPUPointCloud - A reference to the cv::cuda::GpuMat to copy the point cloud frame to.
Returns
std::future<bool> - A future that should be waited on before the passed in frame is used. Value will be true if frame was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-26

Reimplemented from ZEDCamera.

1211{
1212 // Assemble the FrameFetchContainer.
1213 containers::FrameFetchContainer<cv::cuda::GpuMat> stContainer(cvGPUPointCloud, PIXEL_FORMATS::eXYZBGRA);
1214
1215 // Acquire lock on frame copy queue.
1216 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1217 // Append frame fetch container to the schedule queue.
1218 m_qGPUFrameCopySchedule.push(stContainer);
1219 // Release lock on the frame schedule queue.
1220 lkSchedulers.unlock();
1221
1222 // Check if point cloud queue toggle has already been set.
1223 if (!m_bPointCloudsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1224 {
1225 // Signify that the point cloud queue is not empty.
1226 m_bPointCloudsQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1227 }
1228
1229 // Return the future from the promise stored in the container.
1230 return stContainer.pCopiedFrameStatus->get_future();
1231}

◆ RequestPositionalPoseCopy()

std::future< bool > ZEDCam::RequestPositionalPoseCopy ( Pose &  stPose)
overridevirtual

Requests the current pose of the camera relative to its start pose or the origin of the set pose. Puts a Pose pointer into a queue so a copy of a pose from the camera can be written to it. If positional tracking is not enabled, this method will return false and the ZEDCam::Pose may be uninitialized.

Parameters
stPose - A reference to the ZEDCam::Pose object to copy the current camera pose to.
Returns
std::future<bool> - A future that should be waited on before the passed in sl::Pose is used. Value will be true if pose was successfully retrieved.
Note
If this camera is acting as the ZEDSDK Fusion master instance, then the positional pose returned will be from the Fusion instance (i.e., the fused GNSS and visual inertial odometry pose).
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-27

Implements ZEDCamera.

1249{
1250 // Acquire read lock.
1251 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1252 // Check if positional tracking has been enabled.
1253 if (m_slCamera.isPositionalTrackingEnabled())
1254 {
1255 // Release lock.
1256 lkCameraLock.unlock();
1257 // Assemble the data container.
1258 containers::DataFetchContainer<Pose> stContainer(stPose);
1259
1260 // Acquire lock on pose copy queue.
1261 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1262 // Append pose fetch container to the schedule queue.
1263 m_qPoseCopySchedule.push(stContainer);
1264 // Release lock on the pose schedule queue.
1265 lkSchedulers.unlock();
1266
1267 // Check if pose queue toggle has already been set.
1268 if (!m_bPosesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1269 {
1270 // Signify that the pose queue is not empty.
1271 m_bPosesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1272 }
1273
1274 // Return the future from the promise stored in the container.
1275 return stContainer.pCopiedDataStatus->get_future();
1276 }
1277 else
1278 {
1279 // Release lock.
1280 lkCameraLock.unlock();
1281 // Submit logger message.
1282 LOG_WARNING(logging::g_qSharedLogger, "Attempted to get ZED positional pose but positional tracking is not enabled or is still initializing!");
1283
1284 // Create dummy promise to return the future.
1285 std::promise<bool> pmDummyPromise;
1286 std::future<bool> fuDummyFuture = pmDummyPromise.get_future();
1287 // Set future value.
1288 pmDummyPromise.set_value(false);
1289
1290 // Return unsuccessful.
1291 return fuDummyFuture;
1292 }
1293}
This struct is used to carry references to any datatype for scheduling and copying....
Definition FetchContainers.hpp:165
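
A minimal sketch, assuming zedMainCam from the earlier example and that EnablePositionalTracking() has already succeeded: request the current pose and wait on the future before reading it. The Pose type is the struct named in the RequestPositionalPoseCopy() signature above.

Pose stPose;
std::future<bool> fuPoseCopied = zedMainCam.RequestPositionalPoseCopy(stPose);
if (fuPoseCopied.get())
{
    // stPose now holds the camera translation and orientation relative to its start pose
    // (or the pose set with SetPositionalPose()).
}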

◆ RequestFusionGeoPoseCopy()

std::future< bool > ZEDCam::RequestFusionGeoPoseCopy ( sl::GeoPose &  slGeoPose)
overridevirtual

Requests the current geo pose of the camera. This method should be used to retrieve ONLY the GNSS data from the ZEDSDK's Fusion module. Puts a GeoPose pointer into a queue so a copy of a GeoPose from the camera can be written to it. If positional tracking or fusion is disabled for this camera, then this method will return false and the sl::GeoPose may be uninitialized.

Parameters
slGeoPose - A reference to the sl::GeoPose object to copy the current geo pose to.
Returns
std::future<bool> - A future that should be waited on before the passed in sl::GeoPose is used. Value will be true if geo pose was successfully retrieved.
Note
This camera must be a Fusion Master and the NavBoard must be giving accurate info to the camera for this to be functional.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2024-01-25

Implements ZEDCamera.

1311{
1312 // Acquire read lock.
1313 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1314 // Check if positional tracking has been enabled.
1315 if (m_bCameraIsFusionMaster && m_slCamera.isPositionalTrackingEnabled())
1316 {
1317 // Release lock.
1318 lkCameraLock.unlock();
1319 // Assemble the data container.
1320 containers::DataFetchContainer<sl::GeoPose> stContainer(slGeoPose);
1321
1322 // Acquire lock on frame copy queue.
1323 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1324 // Append frame fetch container to the schedule queue.
1325 m_qGeoPoseCopySchedule.push(stContainer);
1326 // Release lock on the frame schedule queue.
1327 lkSchedulers.unlock();
1328
1329 // Check if pose queue toggle has already been set.
1330 if (!m_bGeoPosesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1331 {
1332 // Signify that the pose queue is not empty.
1333 m_bGeoPosesQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1334 }
1335
1336 // Return the future from the promise stored in the container.
1337 return stContainer.pCopiedDataStatus->get_future();
1338 }
1339 else
1340 {
1341 // Release lock.
1342 lkCameraLock.unlock();
1343 // Submit logger message.
1344 LOG_WARNING(logging::g_qSharedLogger,
1345 "Attempted to get ZED FUSION geo pose but positional tracking is not enabled and/or this camera was not initialized as a Fusion Master!");
1346
1347 // Create dummy promise to return the future.
1348 std::promise<bool> pmDummyPromise;
1349 std::future<bool> fuDummyFuture = pmDummyPromise.get_future();
1350 // Set future value.
1351 pmDummyPromise.set_value(false);
1352
1353 // Return unsuccessful.
1354 return fuDummyFuture;
1355 }
1356}
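
A minimal sketch, assuming zedMainCam was constructed with bEnableFusionMaster == true and that GNSS data is being ingested via IngestGPSDataToFusion(): request the fused geo pose and wait on the future.

sl::GeoPose slGeoPose;
std::future<bool> fuGeoPoseCopied = zedMainCam.RequestFusionGeoPoseCopy(slGeoPose);
if (fuGeoPoseCopied.get())
{
    // slGeoPose now contains the fused GNSS/VIO pose from the ZEDSDK Fusion module.
}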

◆ RequestFloorPlaneCopy()

std::future< bool > ZEDCam::RequestFloorPlaneCopy ( sl::Plane &  slPlane)
overridevirtual

Requests the current floor plane of the camera relative to its current pose. Puts a Plane pointer into a queue so a copy of the floor plane from the camera can be written to it. If positional tracking is not enabled, this method will return false and the sl::Plane may be uninitialized.

Parameters
slPlane - A reference to the sl::Plane object to copy the current camera floor plane to.
Returns
std::future<bool> - A future that should be waited on before the passed in sl::Plane is used. Value will be true if data was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-10-22

Reimplemented from ZEDCamera.

1371{
1372 // Acquire read lock.
1373 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1374 // Check if positional tracking has been enabled.
1375 if (m_slCamera.isPositionalTrackingEnabled())
1376 {
1377 // Release lock.
1378 lkCameraLock.unlock();
1379 // Assemble the data container.
1380 containers::DataFetchContainer<sl::Plane> stContainer(slPlane);
1381
1382 // Acquire lock on pose copy queue.
1383 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1384 // Append data fetch container to the schedule queue.
1385 m_qFloorCopySchedule.push(stContainer);
1386 // Release lock on the pose schedule queue.
1387 lkSchedulers.unlock();
1388
1389 // Check if pose queue toggle has already been set.
1390 if (!m_bFloorsQueued.load(std::memory_order_relaxed))
1391 {
1392 // Signify that the pose queue is not empty.
1393 m_bFloorsQueued.store(true, std::memory_order_relaxed);
1394 }
1395
1396 // Return the future from the promise stored in the container.
1397 return stContainer.pCopiedDataStatus->get_future();
1398 }
1399 else
1400 {
1401 // Release lock.
1402 lkCameraLock.unlock();
1403 // Submit logger message.
1404 LOG_WARNING(logging::g_qSharedLogger, "Attempted to get ZED floor plane but positional tracking is not enabled!");
1405
1406 // Create dummy promise to return the future.
1407 std::promise<bool> pmDummyPromise;
1408 std::future<bool> fuDummyFuture = pmDummyPromise.get_future();
1409 // Set future value.
1410 pmDummyPromise.set_value(false);
1411
1412 // Return unsuccessful.
1413 return fuDummyFuture;
1414 }
1415}

◆ RequestSensorsCopy()

std::future< bool > ZEDCam::RequestSensorsCopy ( sl::SensorsData &  slSensorsData)
overridevirtual

Requests the most up to date sensors data from the camera. This data includes IMU pose and raw values, barometer, magnetometer, and temperature data.

Parameters
slSensorsData - A reference to the sl::SensorsData struct to copy data into.
Returns
std::future<bool> - A future that should be waited on before the passed in sl::SensorsData is used. Value will be true if data was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2025-08-26

Reimplemented from ZEDCamera.

1429{
1430 // Assemble the data container.
1431 containers::DataFetchContainer<sl::SensorsData> stContainer(slSensorsData);
1432
1433 // Acquire lock on sensors copy queue.
1434 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1435 // Append sensors fetch container to the schedule queue.
1436 m_qSensorsCopySchedule.push(stContainer);
1437 // Release lock on the sensors schedule queue.
1438 lkSchedulers.unlock();
1439
1440 // Check if sensors queue toggle has already been set.
1441 if (!m_bSensorsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1442 {
1443 // Signify that the sensors queue is not empty.
1444 m_bSensorsQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1445 }
1446
1447 // Return the future from the promise stored in the container.
1448 return stContainer.pCopiedDataStatus->get_future();
1449}
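
A minimal sketch, assuming zedMainCam from the earlier example: request the latest sensors packet and wait on the future before reading it.

sl::SensorsData slSensorsData;
std::future<bool> fuSensorsCopied = zedMainCam.RequestSensorsCopy(slSensorsData);
if (fuSensorsCopied.get())
{
    // slSensorsData now holds the most recent IMU, barometer, magnetometer, and temperature values.
}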

◆ RequestObjectsCopy()

std::future< bool > ZEDCam::RequestObjectsCopy ( std::vector< sl::ObjectData > &  vObjectData)
overridevirtual

Requests a current copy of the tracked objects from the camera. Puts a pointer to a vector containing sl::ObjectData into a queue so a copy of the tracked objects from the camera can be written to it.

Parameters
vObjectData - A vector that will have sl::ObjectData objects copied into it.
Returns
std::future<bool> - A future that should be waited on before the passed in vector is used. Value will be true if data was successfully retrieved.
Author
clayjay3 (claytonraycowen@gmail.com)
Date
2023-08-27

Reimplemented from ZEDCamera.

1463{
1464 // Acquire read lock.
1465 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1466 // Check if object detection has been enabled.
1467 if (m_slCamera.isObjectDetectionEnabled())
1468 {
1469 // Release lock.
1470 lkCameraLock.unlock();
1471 // Assemble the data container.
 1472 containers::DataFetchContainer<std::vector<sl::ObjectData>> stContainer(vObjectData);
 1473
1474 // Acquire lock on object copy queue.
1475 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1476 // Append data fetch container to the schedule queue.
1477 m_qObjectDataCopySchedule.push(stContainer);
1478 // Release lock on the object schedule queue.
1479 lkSchedulers.unlock();
1480
1481 // Check if objects queue toggle has already been set.
1482 if (!m_bObjectsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1483 {
1484 // Signify that the objects queue is not empty.
1485 m_bObjectsQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1486 }
1487
1488 // Return the future from the promise stored in the container.
1489 return stContainer.pCopiedDataStatus->get_future();
1490 }
1491 else
1492 {
1493 // Release lock.
1494 lkCameraLock.unlock();
1495 // Submit logger message.
1496 LOG_WARNING(logging::g_qSharedLogger, "Attempted to get ZED object data but object detection/tracking is not enabled!");
1497
1498 // Create dummy promise to return the future.
1499 std::promise<bool> pmDummyPromise;
1500 std::future<bool> fuDummyFuture = pmDummyPromise.get_future();
1501 // Set future value.
1502 pmDummyPromise.set_value(false);
1503
1504 // Return unsuccessful.
1505 return fuDummyFuture;
1506 }
1507}
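
A minimal sketch, assuming zedMainCam from the earlier example and that EnableObjectDetection() has already succeeded: request the currently tracked objects and wait on the future.

std::vector<sl::ObjectData> vObjectData;
std::future<bool> fuObjectsCopied = zedMainCam.RequestObjectsCopy(vObjectData);
if (fuObjectsCopied.get())
{
    // vObjectData now contains a copy of the camera's currently tracked objects.
}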

◆ RequestBatchedObjectsCopy()

std::future< bool > ZEDCam::RequestBatchedObjectsCopy ( std::vector< sl::ObjectsBatch > &  vBatchedObjectData)
overridevirtual

If batching is enabled, this requests the normal objects and passes them to the internal batching queue of the ZED API. This performs short-term re-identification with deep learning and trajectories filtering. Batching must have been set to enabled when EnableObjectDetection() was called. Most of the time the vector will be empty and will be filled every ZED_OBJDETECTION_BATCH_LATENCY.

Parameters
vBatchedObjectData - A vector of sl::ObjectsBatch objects that will have object data copied into it.
Returns
std::future<bool> - A future that should be waited on before the passed in vector is used. Value will be true if data was successfully retrieved.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-30

Reimplemented from ZEDCamera.

1525{
1526 // Acquire read lock.
1527 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1528 // Check if object detection and batching has been enabled.
1529 if (m_slCamera.isObjectDetectionEnabled() && m_slObjectDetectionBatchParams.enable)
1530 {
1531 // Release lock.
1532 lkCameraLock.unlock();
1533 // Assemble the data container.
1534 containers::DataFetchContainer<std::vector<sl::ObjectsBatch>> stContainer(vBatchedObjectData);
1535
1536 // Acquire lock on batched object copy queue.
1537 std::unique_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
1538 // Append data fetch container to the schedule queue.
1539 m_qObjectBatchedDataCopySchedule.push(stContainer);
1540 // Release lock on the data schedule queue.
1541 lkSchedulers.unlock();
1542
1543 // Check if objects queue toggle has already been set.
1544 if (!m_bBatchedObjectsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
1545 {
1546 // Signify that the objects queue is not empty.
1547 m_bBatchedObjectsQueued.store(true, ATOMIC_MEMORY_ORDER_METHOD);
1548 }
1549
1550 // Return the future from the promise stored in the container.
1551 return stContainer.pCopiedDataStatus->get_future();
1552 }
1553 else
1554 {
1555 // Release lock.
1556 lkCameraLock.unlock();
1557 // Submit logger message.
1558 LOG_WARNING(logging::g_qSharedLogger, "Attempted to get ZED batched object data but object detection/tracking is not enabled!");
1559
1560 // Create dummy promise to return the future.
1561 std::promise<bool> pmDummyPromise;
1562 std::future<bool> fuDummyFuture = pmDummyPromise.get_future();
1563 // Set future value.
1564 pmDummyPromise.set_value(false);
1565
1566 // Return unsuccessful.
1567 return fuDummyFuture;
1568 }
1569}

◆ ResetPositionalTracking()

sl::ERROR_CODE ZEDCam::ResetPositionalTracking ( )
overridevirtual

Resets the camera's X, Y, Z translation and roll, pitch, yaw orientation back to 0. THINK CAREFULLY! Do you actually want to reset this? It will also realign the coordinate system to whichever way the camera happens to be facing.

Returns
sl::ERROR_CODE - Status of the positional tracking reset.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-26

Implements ZEDCamera.

1582{
1583 // Create new translation to set position back to user given values.
1584 sl::Translation slZeroTranslation(0, 0, 0);
1585 // Update offset member variables.
1586 m_dPoseOffsetX = 0.0;
1587 m_dPoseOffsetY = 0.0;
1588 m_dPoseOffsetZ = 0.0;
1589 // This will reset position and coordinate frame.
1590 sl::Rotation slZeroRotation;
1591 slZeroRotation.setEulerAngles(sl::float3(0.0, 0.0, 0.0), false);
1592
1593 // Store new translation and rotation in a transform object.
1594 sl::Transform slZeroTransform(slZeroRotation, slZeroTranslation);
1595
1596 // Submit logger message.
1597 LOG_NOTICE(logging::g_qSharedLogger, "Resetting positional tracking for camera {} ({})!", sl::toString(m_slCameraModel).get(), m_unCameraSerialNumber);
1598
1599 // Acquire write lock.
1600 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
1601 // Reset the positional tracking location of the camera.
1602 return m_slCamera.resetPositionalTracking(slZeroTransform);
1603}

◆ TrackCustomBoxObjects()

sl::ERROR_CODE ZEDCam::TrackCustomBoxObjects ( std::vector< ZedObjectData > &  vCustomObjects)
overridevirtual

Ingests a vector of ZedObjectData objects, which simply store information about your detected objects from an external object detection model. You will need to take your inference results and package them into this struct so that the ZEDSDK can properly interpret your detections (internally they are repackaged into sl::CustomBoxObjectData).

Giving the bounding boxes of your detected objects to the ZEDSDK will enable positional tracking and velocity estimation for each object, even when it is not in view. The IDs of objects will also become persistent.

Parameters
vCustomObjects- A vector of sl::CustomBoxObjectData objects.
Returns
sl::ERROR_CODE - The return status of ingestion.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-26

Reimplemented from ZEDCamera.

1622{
1623 // Create instance variables.
1624 std::vector<sl::CustomBoxObjectData> vCustomBoxData;
1625
1626 // Repack detection data into sl specific object.
1627 for (ZedObjectData stObjectData : vCustomObjects)
1628 {
1629 // Create new sl CustomBoxObjectData struct.
1630 sl::CustomBoxObjectData slCustomBox;
1631 std::vector<sl::uint2> vCorners;
1632
1633 // Assign simple attributes.
1634 slCustomBox.unique_object_id = sl::String(stObjectData.GetObjectUUID().c_str());
1635 slCustomBox.label = stObjectData.nClassNumber;
1636 slCustomBox.probability = stObjectData.fConfidence;
1637 slCustomBox.is_grounded = stObjectData.bObjectRemainsOnFloorPlane;
1638 // Repackage object corner data.
1639 vCorners.emplace_back(sl::uint2(stObjectData.cvBoundingBox.x, stObjectData.cvBoundingBox.y)); // Top-left corner
1640 vCorners.emplace_back(sl::uint2(stObjectData.cvBoundingBox.x + stObjectData.cvBoundingBox.width, stObjectData.cvBoundingBox.y)); // Top-right corner
1641 vCorners.emplace_back(sl::uint2(stObjectData.cvBoundingBox.x, stObjectData.cvBoundingBox.y + stObjectData.cvBoundingBox.height)); // Bottom-left corner
1642 vCorners.emplace_back(sl::uint2(stObjectData.cvBoundingBox.x + stObjectData.cvBoundingBox.width,
1643 stObjectData.cvBoundingBox.y + stObjectData.cvBoundingBox.height)); // Bottom-right corner
1644 slCustomBox.bounding_box_2d = vCorners;
1645
1646 // Append repackaged object to vector.
1647 vCustomBoxData.emplace_back(slCustomBox);
1648 }
1649
1650 // Acquire write lock.
1651 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
1652 // Give the custom box data to the zed api.
1653 sl::ERROR_CODE slReturnCode = m_slCamera.ingestCustomBoxObjects(vCustomBoxData);
1654 // Release lock.
1655 lkWriteCameraLock.unlock();
1656
1657 // Check if unsuccessful.
1658 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
1659 {
1660 // Submit logger message.
1661 LOG_WARNING(logging::g_qSharedLogger,
1662 "Failed to ingest new objects for camera {} ({})! sl::ERROR_CODE is: {}",
1663 sl::toString(m_slCameraModel).get(),
1664 m_unCameraSerialNumber,
1665 sl::toString(slReturnCode).get());
1666 }
1667
1668 // Return error code.
1669 return slReturnCode;
1670}
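As a rough sketch of how an external detector's output might be repackaged for this method: the member names below are taken from the listing above, but the detection values, the cv::Rect type used for cvBoundingBox, and pMainCam are illustrative assumptions.

    // Repackage one hypothetical detection from an external model.
    std::vector<ZedObjectData> vDetections;
    ZedObjectData stDetection;
    stDetection.nClassNumber               = 0;                          // Class index from the model.
    stDetection.fConfidence                = 0.87f;                      // Model confidence score.
    stDetection.bObjectRemainsOnFloorPlane = true;                       // Object stays on the floor plane.
    stDetection.cvBoundingBox              = cv::Rect(320, 240, 64, 48); // 2D box in image pixels (assumed cv::Rect).
    vDetections.emplace_back(stDetection);

    // Hand the boxes to the ZEDSDK so it can track position, velocity, and persistent IDs.
    sl::ERROR_CODE slIngestStatus = pMainCam->TrackCustomBoxObjects(vDetections);

After ingestion, the tracked results are then available through RequestObjectsCopy().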

◆ RebootCamera()

sl::ERROR_CODE ZEDCam::RebootCamera ( )
overridevirtual

Performs a hardware reset of the ZED2 or ZED2i camera.

Returns
sl::ERROR_CODE - Whether or not the camera reboot was successful.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-26

Implements ZEDCamera.

1681{
1682 // Acquire write lock.
1683 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1684 // Reboot this camera and return the status code.
1685 return sl::Camera::reboot(m_unCameraSerialNumber);
1686}

◆ SubscribeFusionToCameraUUID()

sl::FUSION_ERROR_CODE ZEDCam::SubscribeFusionToCameraUUID ( sl::CameraIdentifier &  slCameraUUID)
overridevirtual

Given a UUID for another ZEDCam, subscribes that camera to this camera's Fusion instance. This tells this camera's Fusion instance to start ingesting and fusing data from the other camera.

Parameters
slCameraUUID- The Camera unique identifier given by the other camera's PublishCameraToFusion() method.
Returns
sl::FUSION_ERROR_CODE - Whether or not the camera and fusion module have been successfully subscribed.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2024-01-26

Implements ZEDCamera.

1699{
1700 // Create instance variables.
1701 sl::FUSION_ERROR_CODE slReturnCode = sl::FUSION_ERROR_CODE::MODULE_NOT_ENABLED;
1702
1703 // Check if this camera is a fusion master.
1704 if (m_bCameraIsFusionMaster)
1705 {
1706 // Acquire write lock.
1707 std::unique_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
1708 // Subscribe this camera to fusion instance.
1709 slReturnCode = m_slFusionInstance.subscribe(slCameraUUID);
1710 // Release lock.
1711 lkFusionLock.unlock();
1712 }
1713
1714 // Check if this camera was successfully subscribed to the Fusion instance.
1715 if (slReturnCode == sl::FUSION_ERROR_CODE::SUCCESS)
1716 {
1717 // Submit logger message.
1718 LOG_DEBUG(logging::g_qSharedLogger,
1719 "Subscribed stereo camera with serial number {} to Fusion instance ran by stereo camera {} ({})!",
1720 slCameraUUID.sn,
1721 sl::toString(m_slCameraModel).get(),
1722 m_unCameraSerialNumber);
1723 }
1724 else
1725 {
1726 // Submit logger message.
1727 LOG_DEBUG(logging::g_qSharedLogger,
1728 "Unable to subscribe camera with serial number {} to FUSION instance for camera {} ({})! sl::FUSION_ERROR_CODE is: {}",
1729 slCameraUUID.sn,
1730 sl::toString(m_slCameraModel).get(),
1731 m_unCameraSerialNumber,
1732 sl::toString(slReturnCode).get());
1733 }
1734
1735 // Return the sl::FUSION_ERROR_CODE status.
1736 return slReturnCode;
1737}

◆ PublishCameraToFusion()

sl::CameraIdentifier ZEDCam::PublishCameraToFusion ( )
overridevirtual

Signal this camera to make its data available to the Fusion module and retrieve a UUID for this class's sl::Camera instance that can be used to subscribe an sl::Fusion instance to this camera later.

Returns
sl::CameraIdentifier - A globally unique identifier generated from this camera's serial number.
Note
Just calling this method does not send data to the ZEDSDK's fusion module. It just enables the capability. The camera acting as the fusion master instance must be subscribed to this camera using the SubscribeFusionToCameraUUID() method and the returned UUID.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2024-01-26

Implements ZEDCamera.

1754{
1755 // Acquire write lock.
1756 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
1757 // Make this camera's data available to the Fusion module if it is later subscribed.
1758 m_slCamera.startPublishing();
1759 // Release lock.
1760 lkWriteCameraLock.unlock();
1761
1762 // Return a UUID for this camera. This is used by the camera running the master fusion instance to subscribe to the data being published.
1763 return sl::CameraIdentifier(m_unCameraSerialNumber);
1764}
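A sketch of the expected wiring between two cameras, assuming pFusionMasterCam was constructed with bEnableFusionMaster set to true and pSecondaryCam was not (both are hypothetical handles to already-opened ZEDCam objects):

    // The secondary camera starts publishing and hands back its UUID.
    sl::CameraIdentifier slSecondaryUUID = pSecondaryCam->PublishCameraToFusion();
    // The fusion master subscribes to that UUID so it begins ingesting the secondary camera's data.
    sl::FUSION_ERROR_CODE slFuseStatus = pFusionMasterCam->SubscribeFusionToCameraUUID(slSecondaryUUID);
    if (slFuseStatus != sl::FUSION_ERROR_CODE::SUCCESS)
    {
        // Subscription failed; log the error and fall back to unfused tracking.
    }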

◆ IngestGPSDataToFusion()

sl::FUSION_ERROR_CODE ZEDCam::IngestGPSDataToFusion ( geoops::GPSCoordinate  stNewGPSLocation)
overridevirtual

If this camera is the fusion instance master, this method can be used to ingest/process/fuse the current GNSS position of the camera with the ZEDSDK positional tracking. This allows the use of RequestGeoPose to get a high resolution and highly accurate GNSS/VIO camera pose.

Parameters
stNewGPSLocation- The current GPS location. This must be up-to-date, i.e. the realtime position of the camera in GPS space.
Returns
sl::FUSION_ERROR_CODE - Return status from the ZEDSDK Fusion module.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2024-04-15

Reimplemented from ZEDCamera.

1778{
1779 // Create instance variables.
1780 sl::FUSION_ERROR_CODE slReturnCode = sl::FUSION_ERROR_CODE::FAILURE;
1781
1782 // Check if this camera is the fusion master.
1783 if (m_bCameraIsFusionMaster)
1784 {
1785 // Acquire read lock.
1786 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1787 // Check if fusion positional tracking is enabled.
1788 if (constants::FUSION_ENABLE_GNSS_FUSION && m_slCamera.isPositionalTrackingEnabled())
1789 {
1790 // Release lock.
1791 lkCameraLock.unlock();
1792
1793 // Check if the GPS data from the NavBoard is new (differs from the previously ingested position).
1794 if (stNewGPSLocation.dLatitude != m_stCurrentGPSBasedPosition.dLatitude && stNewGPSLocation.dLongitude != m_stCurrentGPSBasedPosition.dLongitude &&
1795 stNewGPSLocation.d2DAccuracy != m_stCurrentGPSBasedPosition.d2DAccuracy && stNewGPSLocation.d3DAccuracy != m_stCurrentGPSBasedPosition.d3DAccuracy)
1796 {
1797 // Repack GPS data into an sl::GNSSData object.
1798 sl::GNSSData slGNSSData = sl::GNSSData();
1799 slGNSSData.setCoordinates(stNewGPSLocation.dLatitude, stNewGPSLocation.dLongitude, stNewGPSLocation.dAltitude, false);
1800 // Position covariance matrix expects millimeters.
1801 double dHorizontalAccuracy = stNewGPSLocation.d2DAccuracy * 1000;
1802 double dVerticalAccuracy = stNewGPSLocation.d3DAccuracy * 1000;
1803 // Calculate the covariance matrix from the 2D and 3D accuracies.
1804 slGNSSData.position_covariance = {dHorizontalAccuracy * dHorizontalAccuracy,
1805 0.0,
1806 0.0,
1807 0.0,
1808 dHorizontalAccuracy * dHorizontalAccuracy,
1809 0.0,
1810 0.0,
1811 0.0,
1812 dVerticalAccuracy * dVerticalAccuracy};
1813
1814 // Set GNSS timestamp from input GPSCoordinate data. sl::GNSSData expects time since epoch.
1815 slGNSSData.ts = sl::Timestamp(std::chrono::time_point_cast<std::chrono::nanoseconds>(stNewGPSLocation.tmTimestamp).time_since_epoch().count());
1816
1817 // Get the GNSS fix type status from the given GPS coordinate.
1818 switch (stNewGPSLocation.eCoordinateAccuracyFixType)
1819 {
1820 case geoops::PositionFixType::eNoFix:
1821 {
1822 slGNSSData.gnss_status = sl::GNSS_STATUS::SINGLE;
1823 slGNSSData.gnss_mode = sl::GNSS_MODE::NO_FIX;
1824 break;
1825 }
1826 case geoops::PositionFixType::eDeadReckoning:
1827 {
1828 slGNSSData.gnss_status = sl::GNSS_STATUS::PPS;
1829 slGNSSData.gnss_mode = sl::GNSS_MODE::NO_FIX;
1830 break;
1831 }
1832 case geoops::PositionFixType::eFix2D:
1833 {
1834 slGNSSData.gnss_status = sl::GNSS_STATUS::PPS;
1835 slGNSSData.gnss_mode = sl::GNSS_MODE::FIX_2D;
1836 break;
1837 }
1838 case geoops::PositionFixType::eFix3D:
1839 {
1840 slGNSSData.gnss_status = sl::GNSS_STATUS::RTK_FIX;
1841 slGNSSData.gnss_mode = sl::GNSS_MODE::FIX_3D;
1842 break;
1843 }
1844 case geoops::PositionFixType::eGNSSDeadReckoningCombined:
1845 {
1846 slGNSSData.gnss_status = sl::GNSS_STATUS::RTK_FIX;
1847 slGNSSData.gnss_mode = sl::GNSS_MODE::FIX_3D;
1848 break;
1849 }
1850 case geoops::PositionFixType::eTimeOnly:
1851 {
1852 slGNSSData.gnss_status = sl::GNSS_STATUS::RTK_FIX;
1853 slGNSSData.gnss_mode = sl::GNSS_MODE::FIX_3D;
1854 break;
1855 }
1856 default:
1857 {
1858 slGNSSData.gnss_status = sl::GNSS_STATUS::RTK_FIX;
1859 slGNSSData.gnss_mode = sl::GNSS_MODE::FIX_3D;
1860 break;
1861 }
1862 }
1863 // Check if fix is based off of differential GPS calculations.
1864 if (stNewGPSLocation.bIsDifferential)
1865 {
1866 slGNSSData.gnss_status = sl::GNSS_STATUS::DGNSS;
1867 }
1868
1869 // Acquire write lock.
1870 std::unique_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
1871 // Publish GNSS data to fusion from the NavBoard.
1872 slReturnCode = m_slFusionInstance.ingestGNSSData(slGNSSData);
1873 // Release lock.
1874 lkFusionLock.unlock();
1875 // Check if the GNSS data was successfully ingested by the Fusion instance.
1876 if (slReturnCode != sl::FUSION_ERROR_CODE::SUCCESS)
1877 {
1878 // Covariance error.
1879 if (slReturnCode == sl::FUSION_ERROR_CODE::GNSS_DATA_COVARIANCE_MUST_VARY || slReturnCode == sl::FUSION_ERROR_CODE::INVALID_COVARIANCE)
1880 {
1881 // Submit logger message.
1882 LOG_WARNING(logging::g_qSharedLogger,
1883 "Unable to ingest fusion GNSS data for camera {} ({})! sl::Fusion positional tracking may be inaccurate! sl::FUSION_ERROR_CODE "
1884 "is: {}({}). Current accuracy data is 2D: {}, 3D {}.",
1885 sl::toString(m_slCameraModel).get(),
1886 m_unCameraSerialNumber,
1887 sl::toString(slReturnCode).get(),
1888 static_cast<int>(slReturnCode),
1889 stNewGPSLocation.d2DAccuracy,
1890 stNewGPSLocation.d3DAccuracy);
1891 }
1892 else if (slReturnCode != sl::FUSION_ERROR_CODE::NO_NEW_DATA_AVAILABLE)
1893 {
1894 // Submit logger message.
1895 LOG_WARNING(logging::g_qSharedLogger,
1896 "Unable to ingest fusion GNSS data for camera {} ({})! sl::Fusion positional tracking may be inaccurate! sl::FUSION_ERROR_CODE "
1897 "is: {}({})",
1898 sl::toString(m_slCameraModel).get(),
1899 m_unCameraSerialNumber,
1900 sl::toString(slReturnCode).get(),
1901 static_cast<int>(slReturnCode));
1902 }
1903 }
1904 else
1905 {
1906 // Acquire read lock.
1907 std::shared_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
1908 // Get the current status of the fusion positional tracking.
1909 sl::FusedPositionalTrackingStatus slFusionPoseTrackStatus = m_slFusionInstance.getFusedPositionalTrackingStatus();
1910 // Release lock.
1911 lkFusionLock.unlock();
1912 // Submit logger message. DEBUG log the current fused position tracking state.
1913 LOG_DEBUG(logging::g_qSharedLogger,
1914 "PoseTrack Fusion Status: {}, GNSS Fusion Status: {}, VIO SpatialMemory Status: {}",
1915 sl::toString(slFusionPoseTrackStatus.tracking_fusion_status).get(),
1916 sl::toString(slFusionPoseTrackStatus.gnss_fusion_status).get(),
1917 sl::toString(slFusionPoseTrackStatus.spatial_memory_status).get());
1918 }
1919
1920 // Update current GPS position.
1921 m_stCurrentGPSBasedPosition = stNewGPSLocation;
1922 }
1923 }
1924 else
1925 {
1926 // Release lock.
1927 lkCameraLock.unlock();
1928 // Submit logger message.
1929 LOG_ERROR(logging::g_qSharedLogger,
1930 "Cannot ingest GNSS data because camera {} ({}) does not have positional tracking enabled or constants::FUSION_ENABLE_GNSS_FUSION is false!",
1931 sl::toString(m_slCameraModel).get(),
1932 m_unCameraSerialNumber);
1933 }
1934 }
1935 else
1936 {
1937 // Submit logger message.
1938 LOG_WARNING(logging::g_qSharedLogger,
1939 "Cannot ingest GNSS data because camera {} ({}) is not an sl::Fusion master!",
1940 sl::toString(m_slCameraModel).get(),
1941 m_unCameraSerialNumber);
1942 }
1943
1944 return slReturnCode;
1945}
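A hedged sketch of feeding NavBoard fixes to the fusion master. The geoops::GPSCoordinate is assumed to already be populated by the NavBoard driver; only the member names used in the listing above are taken from the real struct, and pFusionMasterCam is a hypothetical handle.

    // stNewFix is a geoops::GPSCoordinate freshly received from the NavBoard, with dLatitude,
    // dLongitude, dAltitude, d2DAccuracy, d3DAccuracy, tmTimestamp, eCoordinateAccuracyFixType,
    // and bIsDifferential already filled in.
    sl::FUSION_ERROR_CODE slIngestStatus = pFusionMasterCam->IngestGPSDataToFusion(stNewFix);
    if (slIngestStatus != sl::FUSION_ERROR_CODE::SUCCESS && slIngestStatus != sl::FUSION_ERROR_CODE::NO_NEW_DATA_AVAILABLE)
    {
        // GNSS/VIO fusion may be degraded; the method has already logged the specific error code.
    }

Once calibration converges, the fused GNSS/VIO pose can be obtained via RequestGeoPose as mentioned above.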

◆ EnablePositionalTracking()

sl::ERROR_CODE ZEDCam::EnablePositionalTracking ( const float  fExpectedCameraHeightFromFloorTolerance = constants::ZED_DEFAULT_FLOOR_PLANE_ERROR)
overridevirtual

Enable the positional tracking functionality of the camera.

Parameters
fExpectedCameraHeightFromFloorTolerance- The expected height of the camera from the floor. This aids with floor plane detection.
Returns
sl::ERROR_CODE - Whether or not positional tracking was successfully enabled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-10-22

Implements ZEDCamera.

1958{
1959 // Assign member variable.
1960 m_fExpectedCameraHeightFromFloorTolerance = fExpectedCameraHeightFromFloorTolerance;
1961
1962 // Acquire write lock.
1963 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
1964 // Enable pose tracking and store return code.
1965 sl::ERROR_CODE slReturnCode = m_slCamera.enablePositionalTracking(m_slPoseTrackingParams);
1966 // Release lock.
1967 lkCameraLock.unlock();
1968
1969 // Check if positional tracking was enabled properly.
1970 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
1971 {
1972 // Submit logger message.
1973 LOG_ERROR(logging::g_qSharedLogger,
1974 "Failed to enable positional tracking for camera {} ({})! sl::ERROR_CODE is: {}",
1975 sl::toString(m_slCameraModel).get(),
1976 m_unCameraSerialNumber,
1977 sl::toString(slReturnCode).get());
1978 }
1979 // Check if fusion positional tracking should be enabled for this camera.
1980 else if (m_bCameraIsFusionMaster)
1981 {
1982 // Acquire write lock.
1983 std::unique_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
1984 // Enable fusion positional tracking.
1985 sl::FUSION_ERROR_CODE slFusionReturnCode = m_slFusionInstance.enablePositionalTracking(m_slFusionPoseTrackingParams);
1986 // Release lock.
1987 lkFusionLock.unlock();
1988
1989 // Check if the fusion positional tracking was enabled successfully.
1990 if (slFusionReturnCode != sl::FUSION_ERROR_CODE::SUCCESS)
1991 {
1992 // Submit logger message.
1993 LOG_ERROR(logging::g_qSharedLogger,
1994 "Failed to enable fusion positional tracking for camera {} ({})! sl::FUSION_ERROR_CODE is: {}",
1995 sl::toString(m_slCameraModel).get(),
1996 m_unCameraSerialNumber,
1997 sl::toString(slFusionReturnCode).get());
1998 }
1999 }
2000
2001 // Return error code.
2002 return slReturnCode;
2003}
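A minimal usage sketch; the tolerance value and pMainCam are purely illustrative.

    // Enable VIO positional tracking with an illustrative floor-plane height tolerance.
    sl::ERROR_CODE slTrackStatus = pMainCam->EnablePositionalTracking(0.5f);
    if (slTrackStatus != sl::ERROR_CODE::SUCCESS)
    {
        // Pose, geo pose, and floor plane requests will not produce data until this succeeds.
    }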

◆ DisablePositionalTracking()

void ZEDCam::DisablePositionalTracking ( )
overridevirtual

Disable the positional tracking functionality of the camera.

Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-26

Implements ZEDCamera.

2013{
2014 // Check if fusion positional tracking should be disabled for this camera.
2015 if (m_bCameraIsFusionMaster)
2016 {
2017 // Acquire write lock.
2018 std::unique_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
2019 // Disable fusion positional tracking.
2020 m_slFusionInstance.disablePositionalTracking();
2021 // Release lock.
2022 lkFusionLock.unlock();
2023 }
2024 // Acquire write lock.
2025 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2026 // Disable pose tracking.
2027 m_slCamera.disablePositionalTracking();
2028}

◆ SetPositionalPose()

void ZEDCam::SetPositionalPose ( const double  dX,
const double  dY,
const double  dZ,
const double  dXO,
const double  dYO,
const double  dZO 
)
overridevirtual

Sets the pose of the positional tracking of the camera. XYZ will point in their respective directions according to ZED_COORD_SYSTEM defined in AutonomyConstants.h.

Warning: This method is slow and should not be called in a loop. Setting the pose will temporarily block the entire camera from grabbing or copying frames to other threads. This method should only be called occasionally when absolutely needed.

Parameters
dX- The X position of the camera in ZED_MEASURE_UNITS.
dY- The Y position of the camera in ZED_MEASURE_UNITS.
dZ- The Z position of the camera in ZED_MEASURE_UNITS.
dXO- The tilt of the camera around the X axis in degrees. (0-360)
dYO- The tilt of the camera around the Y axis in degrees. (0-360)
dZO- The tilt of the camera around the Z axis in degrees. (0-360)
Bug:
The ZEDSDK currently cannot handle resetting the positional pose with large translational (dX, dY, dZ) values without breaking positional tracking. This is because the values are floats and not doubles. To fix this I (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om) have decided to just handle the translation offsets internally. So when SetPositionalPose() is called, it assigns the dX, dY, and dZ values to private member variables of this class, and the offsets are then added to the pose in PooledLinearCode() under the pose requests section. If StereoLabs fixes this in the future, I will go back to using the sl::Translation to reset positional tracking.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Implements ZEDCamera.

2057{
2058 // Update offset member variables.
2059 m_dPoseOffsetX = dX - m_slCameraPose.getTranslation().x;
2060 m_dPoseOffsetY = dY - m_slCameraPose.getTranslation().y;
2061 m_dPoseOffsetZ = dZ - m_slCameraPose.getTranslation().z;
2062 // Find the angular distance from current and desired pose angles. This is complicated because zed uses different angle ranges.
2063 m_dPoseOffsetXO =
2064 numops::InputAngleModulus(numops::AngularDifference(numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).x, 0.0, 360.0), dXO), 0.0, 360.0);
2065 m_dPoseOffsetYO =
2066 numops::InputAngleModulus(numops::AngularDifference(numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).y, 0.0, 360.0), dYO), 0.0, 360.0);
2067 m_dPoseOffsetZO =
2068 numops::InputAngleModulus(numops::AngularDifference(numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).z, 0.0, 360.0), dZO), 0.0, 360.0);
2069}
constexpr T InputAngleModulus(T tValue, T tMinValue, T tMaxValue)
Calculates the modulus of an input angle.
Definition NumberOperations.hpp:165
constexpr T AngularDifference(T tFirstValue, T tSecondValue)
Calculates the distance in degrees between two angles. This function accounts for wrap around so that...
Definition NumberOperations.hpp:204
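For example, re-aligning the tracked pose to a known rover position might look like the following sketch (values and pMainCam are illustrative; XYZ are in ZED_MEASURE_UNITS and the orientations are degrees, per the parameter list above):

    // Place the camera at (1.5, 0.0, 3.2) with a 90 degree rotation about the Y axis.
    pMainCam->SetPositionalPose(1.5, 0.0, 3.2, 0.0, 90.0, 0.0);
    // Per the bug note above, the offsets are stored internally and applied to every subsequent
    // pose request instead of being pushed into the ZEDSDK's resetPositionalTracking().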

◆ EnableSpatialMapping()

sl::ERROR_CODE ZEDCam::EnableSpatialMapping ( )
overridevirtual

Enables the spatial mapping feature of the camera. Pose tracking will be enabled if it is not already.

Returns
sl::ERROR_CODE - Whether or not spatial mapping was successfully enabled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2081{
2082 // Create instance variables.
2083 sl::Pose slCameraPose;
2084 sl::ERROR_CODE slReturnCode = sl::ERROR_CODE::SUCCESS;
2085
2086 // Acquire read lock.
2087 std::shared_lock<std::shared_mutex> lkReadCameraLock(m_muCameraMutex);
2088 // Check if positional tracking is enabled.
2089 if (!m_slCamera.isPositionalTrackingEnabled())
2090 {
2091 // Release lock.
2092 lkReadCameraLock.unlock();
2093 // Enable positional tracking.
2094 slReturnCode = this->EnablePositionalTracking();
2095 }
2096 else
2097 {
2098 // Release lock.
2099 lkReadCameraLock.unlock();
2100 }
2101
2102 // Check if positional tracking is or was enabled successfully.
2103 if (slReturnCode == sl::ERROR_CODE::SUCCESS)
2104 {
2105 // Acquire write lock.
2106 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
2107 // Call camera grab function once to ensure the camera is initialized with data.
2108 m_slCamera.grab(m_slRuntimeParams);
2109 // Enable spatial mapping.
2110 slReturnCode = m_slCamera.enableSpatialMapping(m_slSpatialMappingParams);
2111 // Release lock.
2112 lkWriteCameraLock.unlock();
2113
2114 // Check if spatial mapping was enabled properly.
2115 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
2116 {
2117 // Submit logger message.
2118 LOG_ERROR(logging::g_qSharedLogger,
2119 "Failed to enabled spatial mapping for camera {} ({})! sl::ERROR_CODE is: {}",
2120 sl::toString(m_slCameraModel).get(),
2121 m_unCameraSerialNumber,
2122 sl::toString(slReturnCode).get());
2123 }
2124 }
2125 else
2126 {
2127 // Submit logger message.
2128 LOG_ERROR(logging::g_qSharedLogger,
2129 "Failed to enabled spatial mapping for camera {} ({}) because positional tracking could not be enabled! sl::ERROR_CODE is: {}",
2130 sl::toString(m_slCameraModel).get(),
2131 m_unCameraSerialNumber,
2132 sl::toString(slReturnCode).get());
2133 }
2134
2135 // Return error code.
2136 return slReturnCode;
2137}
sl::ERROR_CODE EnablePositionalTracking(const float fExpectedCameraHeightFromFloorTolerance=constants::ZED_DEFAULT_FLOOR_PLANE_ERROR) override
Enable the positional tracking functionality of the camera.
Definition ZEDCam.cpp:1957

◆ DisableSpatialMapping()

void ZEDCam::DisableSpatialMapping ( )
overridevirtual

Disables the spatial mapping feature of the camera.

Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2147{
2148 // Acquire write lock.
2149 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2150 // Disable spatial mapping.
2151 m_slCamera.disableSpatialMapping();
2152}

◆ EnableObjectDetection()

sl::ERROR_CODE ZEDCam::EnableObjectDetection ( const bool  bEnableBatching = false)
overridevirtual

Enables the object detection and tracking feature of the camera.

Parameters
bEnableBatching- Whether or not to enable the ZEDSDK's object batching feature. Batching must be enabled here for RequestBatchedObjectsCopy() to return data.
Returns
sl::ERROR_CODE - Whether or not object detection/tracking was successfully enabled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2163{
2164 // Check if batching should be turned on.
2165 bEnableBatching ? m_slObjectDetectionBatchParams.enable = true : m_slObjectDetectionBatchParams.enable = false;
2166 // Give batch params to detection params.
2167 m_slObjectDetectionParams.batch_parameters = m_slObjectDetectionBatchParams;
2168
2169 // Acquire write lock.
2170 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2171 // Enable object detection.
2172 sl::ERROR_CODE slReturnCode = m_slCamera.enableObjectDetection(m_slObjectDetectionParams);
2173 // Release lock.
2174 lkCameraLock.unlock();
2175
2176 // Check if object detection was enabled properly.
2177 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
2178 {
2179 // Submit logger message.
2180 LOG_ERROR(logging::g_qSharedLogger,
2181 "Failed to enabled object detection for camera {} ({})! sl::ERROR_CODE is: {}",
2182 sl::toString(m_slCameraModel).get(),
2183 m_unCameraSerialNumber,
2184 sl::toString(slReturnCode).get());
2185 }
2186
2187 // Return error code.
2188 return slReturnCode;
2189}
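A sketch tying the batching flag to the batched-object request documented earlier; pMainCam and the waiting pattern are illustrative.

    // Enable object detection with batching so RequestBatchedObjectsCopy() can return data later.
    sl::ERROR_CODE slDetectStatus = pMainCam->EnableObjectDetection(true);
    if (slDetectStatus == sl::ERROR_CODE::SUCCESS)
    {
        std::vector<sl::ObjectsBatch> vBatches;
        std::future<bool> fuBatchStatus = pMainCam->RequestBatchedObjectsCopy(vBatches);
        // The vector is frequently empty; it is only refilled roughly every ZED_OBJDETECTION_BATCH_LATENCY.
        if (fuBatchStatus.get())
        {
            // Process the trajectory-filtered, re-identified object batches here.
        }
    }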

◆ DisableObjectDetection()

void ZEDCam::DisableObjectDetection ( )
overridevirtual

Disables the object detection and tracking feature of the camera.

Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2199{
2200 // Acquire write lock.
2201 std::unique_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2202 // Disable object detection and tracking.
2203 m_slCamera.disableObjectDetection();
2204}

◆ GetCameraIsOpen()

bool ZEDCam::GetCameraIsOpen ( )
overridevirtual

Accessor for the current status of the camera.

Returns
true - Camera is currently opened and functional.
false - Camera is not opened and/or connected.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Implements Camera< cv::Mat >.

2216{
2217 // Acquire read lock.
2218 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2219 return this->GetThreadState() == AutonomyThreadState::eRunning && m_slCamera.isOpened();
2220}
AutonomyThreadState GetThreadState() const
Accessor for the Threads State private member.
Definition AutonomyThread.hpp:214

◆ GetUsingGPUMem()

bool ZEDCam::GetUsingGPUMem ( ) const
overridevirtual

Accessor for whether this ZED is storing its frames in GPU memory.

Returns
true - Using GPU memory for mats.
false - Using CPU memory for mats.
Author
ClayJay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-09-09

Reimplemented from ZEDCamera.

2232{
2233 // Check if we are using GPU memory.
2234 return m_slMemoryType == sl::MEM::GPU;
2235}

◆ GetCameraModel()

std::string ZEDCam::GetCameraModel ( )
overridevirtual

Accessor for the camera model. Retrieves the model enum from the ZEDSDK and represents it as a string.

Returns
std::string - The model of the zed camera. Possible values: ZED, ZED_MINI, ZED_2, ZED_2i, ZED_X, ZED_X_MINI, UNDEFINED_UNKNOWN
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Implements ZEDCamera.

2247{
2248 // Acquire read lock.
2249 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2250 // Check if the camera is opened.
2251 if (m_slCamera.isOpened())
2252 {
2253 // Release lock.
2254 lkCameraLock.unlock();
2255 // Convert camera model to a string and return.
2256 return sl::toString(m_slCameraModel).get();
2257 }
2258 else
2259 {
2260 // Release lock.
2261 lkCameraLock.unlock();
2262 // Return the model string to show camera isn't opened.
2263 return "NOT_OPENED";
2264 }
2265}

◆ GetCameraSerial()

unsigned int ZEDCam::GetCameraSerial ( )
overridevirtual

Accessor for the camera's serial number.

Returns
unsigned int - The serial number of the camera.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2276{
2277 // Return the camera's serial number.
2278 return m_unCameraSerialNumber;
2279}

◆ GetPositionalTrackingEnabled()

bool ZEDCam::GetPositionalTrackingEnabled ( )
overridevirtual

Accessor for whether the positional tracking functionality of the camera is enabled and functioning.

Returns
true - Positional tracking is enabled.
false - Positional tracking is not enabled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Implements ZEDCamera.

2292{
2293 // Acquire read lock.
2294 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2295 return m_slCamera.isPositionalTrackingEnabled() && m_slCamera.getPositionalTrackingStatus().odometry_status == sl::ODOMETRY_STATUS::OK;
2296}

◆ GetPositionalTrackingState()

sl::PositionalTrackingStatus ZEDCam::GetPositionalTrackingState ( )
overridevirtual

Accessor for the current positional tracking status of the camera.

Returns
sl::PositionalTrackingStatus - The sl::PositionalTrackingStatus struct storing information about the current VIO positional tracking state.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2024-04-20

Reimplemented from ZEDCamera.

2308{
2309 // Acquire read lock.
2310 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2311 return m_slCamera.getPositionalTrackingStatus();
2312}

◆ GetFusedPositionalTrackingState()

sl::FusedPositionalTrackingStatus ZEDCam::GetFusedPositionalTrackingState ( )
overridevirtual

Accessor for the current positional tracking status of the fusion instance.

Returns
sl::FusedPositionalTrackingStatus - The sl::FusedPositionalTrackingStatus struct storing information about the current VIO and GNSS fusion positional tracking state.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2024-04-22

Reimplemented from ZEDCamera.

2324{
2325 // Acquire read lock.
2326 std::shared_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
2327 return m_slFusionInstance.getFusedPositionalTrackingStatus();
2328}
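For instance, a monitoring loop might poll this accessor on the fusion master to check whether GNSS fusion has converged. The status field names follow those logged by IngestGPSDataToFusion() above; pFusionMasterCam is a hypothetical handle.

    // Poll the fused VIO/GNSS tracking state.
    sl::FusedPositionalTrackingStatus slFusedStatus = pFusionMasterCam->GetFusedPositionalTrackingState();
    LOG_DEBUG(logging::g_qSharedLogger,
              "Fusion status: {}, GNSS fusion status: {}",
              sl::toString(slFusedStatus.tracking_fusion_status).get(),
              sl::toString(slFusedStatus.gnss_fusion_status).get());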

◆ GetSpatialMappingState()

sl::SPATIAL_MAPPING_STATE ZEDCam::GetSpatialMappingState ( )
overridevirtual

Accessor for the current state of the camera's spatial mapping feature.

Returns
sl::SPATIAL_MAPPING_STATE - The enum value of the spatial mapping state.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2339{
2340 // Acquire read lock.
2341 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2342 // Return the current spatial mapping state of the camera.
2343 return m_slCamera.getSpatialMappingState();
2344}

◆ ExtractSpatialMapAsync()

sl::SPATIAL_MAPPING_STATE ZEDCam::ExtractSpatialMapAsync ( std::future< sl::Mesh > &  fuMeshFuture)
overridevirtual

Retrieve the built spatial map from the camera. Spatial mapping must be enabled. This method takes in a std::future<sl::Mesh> to eventually store the map in. It returns an enum code representing whether building the map was successfully scheduled. Any code other than SPATIAL_MAPPING_STATE::OK means the future will never be filled.

Parameters
fuMeshFuture- The future to eventually store the map in.
Returns
sl::SPATIAL_MAPPING_STATE - Whether or not the building of the map was successfully scheduled. Anything other than OK means the future will never be filled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2360{
2361 // Acquire read lock.
2362 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2363 // Get and store current state of spatial mapping.
2364 sl::SPATIAL_MAPPING_STATE slReturnState = m_slCamera.getSpatialMappingState();
2365
2366 // Check if spatial mapping has been enabled and ready
2367 if (slReturnState == sl::SPATIAL_MAPPING_STATE::OK)
2368 {
2369 // Request that the ZEDSDK begin processing the spatial map for export.
2370 m_slCamera.requestSpatialMapAsync();
2371
2372 // Start an async thread to wait for spatial map processing to finish. Return resultant future object.
2373 fuMeshFuture = std::async(std::launch::async,
2374 [this]()
2375 {
2376 // Create instance variables.
2377 sl::Mesh slSpatialMap;
2378
2379 // Loop until map is finished generating.
2380 while (m_slCamera.getSpatialMapRequestStatusAsync() == sl::ERROR_CODE::FAILURE)
2381 {
2382 // Sleep for 10ms.
2383 std::this_thread::sleep_for(std::chrono::milliseconds(10));
2384 }
2385
2386 // Check if the spatial map was exported successfully.
2387 if (m_slCamera.getSpatialMapRequestStatusAsync() == sl::ERROR_CODE::SUCCESS)
2388 {
2389 // Get and store the spatial map.
2390 m_slCamera.retrieveSpatialMapAsync(slSpatialMap);
2391
2392 // Return spatial map.
2393 return slSpatialMap;
2394 }
2395 else
2396 {
2397 // Submit logger message.
2398 LOG_ERROR(logging::g_qSharedLogger,
2399 "Failed to extract ZED spatial map. sl::ERROR_CODE is: {}",
2400 sl::toString(m_slCamera.getSpatialMapRequestStatusAsync()).get());
2401
2402 // Return empty point cloud.
2403 return sl::Mesh();
2404 }
2405 });
2406 }
2407 else
2408 {
2409 // Submit logger message.
2410 LOG_WARNING(logging::g_qSharedLogger, "ZED spatial mapping was never enabled, can't extract spatial map!");
2411 }
2412
2413 // Return current spatial mapping state.
2414 return slReturnState;
2415}
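A sketch of the full extraction flow, assuming spatial mapping was already enabled with EnableSpatialMapping() and pMainCam is a hypothetical handle:

    // Schedule asynchronous extraction of the spatial map.
    std::future<sl::Mesh> fuMeshFuture;
    sl::SPATIAL_MAPPING_STATE slMapState = pMainCam->ExtractSpatialMapAsync(fuMeshFuture);
    // Only if scheduling succeeded will the future ever be filled.
    if (slMapState == sl::SPATIAL_MAPPING_STATE::OK)
    {
        sl::Mesh slSpatialMap = fuMeshFuture.get();    // Blocks until the map finishes generating.
        // Save or post-process the mesh here.
    }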

◆ GetObjectDetectionEnabled()

bool ZEDCam::GetObjectDetectionEnabled ( )
overridevirtual

Accessor for whether the camera's object detection and tracking feature is enabled.

Returns
true - Object detection and tracking is enabled.
false - Object detection and tracking is not enabled.
Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-08-27

Reimplemented from ZEDCamera.

2427{
2428 // Acquire read lock.
2429 std::shared_lock<std::shared_mutex> lkCameraLock(m_muCameraMutex);
2430 return m_slCamera.isObjectDetectionEnabled();
2431}

◆ ThreadedContinuousCode()

void ZEDCam::ThreadedContinuousCode ( )
overrideprivatevirtual

The code inside this private method runs in a separate thread, but still has access to this*. This method continuously calls the grab() function of the ZEDSDK, which updates all frames (RGB, depth, cloud) and all other data such as positional and spatial mapping. Then a thread pool is started and joined once per iteration to mass copy the frames and/or measures to any other threads waiting in the queues.

Author
clayjay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-09-01

Implements AutonomyThread< void >.

288{
289 // Acquire read lock for camera object.
290 std::shared_lock<std::shared_mutex> lkReadCameraLock(m_muCameraMutex);
291 // Check if camera is opened.
292 if (!m_slCamera.isOpened())
293 {
294 // Release lock.
295 lkReadCameraLock.unlock();
296
297 // If this is the first iteration of the thread the camera probably isn't present so stop thread to save resources.
298 if (this->GetThreadState() == AutonomyThreadState::eStarting)
299 {
300 // Shutdown threads for this ZEDCam.
301 this->RequestStop();
302 // Submit logger message.
303 LOG_CRITICAL(logging::g_qSharedLogger,
304 "Camera start was attempted for ZED camera with serial number {}, but camera never properly opened or it has been closed/rebooted!",
305 m_unCameraSerialNumber);
306 }
307 else
308 {
309 // Create instance variables.
310 static bool bReopenAlreadyChecked = false;
311 std::chrono::time_point tmCurrentTime = std::chrono::system_clock::now();
312 // Convert time point to seconds since epoch
313 int nTimeSinceEpoch = std::chrono::duration_cast<std::chrono::seconds>(tmCurrentTime.time_since_epoch()).count();
314
315 // Only try to reopen camera every 5 seconds.
316 if (nTimeSinceEpoch % 5 == 0 && !bReopenAlreadyChecked)
317 {
318 // Acquire write lock for camera object.
319 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
320 // Attempt to reopen camera.
321 sl::ERROR_CODE slReturnCode = m_slCamera.open(m_slCameraParams);
322 // Release lock.
323 lkWriteCameraLock.unlock();
324
325 // Check if camera was reopened.
326 if (slReturnCode == sl::ERROR_CODE::SUCCESS)
327 {
328 // Submit logger message.
329 LOG_INFO(logging::g_qSharedLogger, "ZED stereo camera with serial number {} has been reconnected and reopened!", m_unCameraSerialNumber);
330
331 // Check if this camera is a fusion master.
332 if (m_bCameraIsFusionMaster)
333 {
334 // Acquire write lock for camera object.
335 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
336 // Enable odometry publishing for this ZED camera.
337 m_slCamera.startPublishing();
338 // Release lock.
339 lkWriteCameraLock.unlock();
340 }
341
342 // Check if positional tracking was enabled.
343 if (!m_slCamera.isPositionalTrackingEnabled())
344 {
345 slReturnCode = this->EnablePositionalTracking();
346 }
347 else
348 {
349 // Submit logger message.
350 LOG_ERROR(logging::g_qSharedLogger,
351 "After reopening ZED stereo camera with serial number {}, positional tracking failed to reinitialize. sl::ERROR_CODE is: {}",
352 m_unCameraSerialNumber,
353 sl::toString(slReturnCode).get());
354 }
355 // Check if spatial mapping was enabled.
356 if (m_slCamera.getSpatialMappingState() != sl::SPATIAL_MAPPING_STATE::OK)
357 {
358 slReturnCode = this->EnableSpatialMapping();
359 }
360 else
361 {
362 // Submit logger message.
363 LOG_ERROR(logging::g_qSharedLogger,
364 "After reopening ZED stereo camera with serial number {}, spatial mapping failed to reinitialize. sl::ERROR_CODE is: {}",
365 m_unCameraSerialNumber,
366 sl::toString(slReturnCode).get());
367 }
368 // Check if object detection was enabled.
369 if (!m_slCamera.isObjectDetectionEnabled())
370 {
371 slReturnCode = this->EnableObjectDetection();
372 }
373 else
374 {
375 // Submit logger message.
376 LOG_ERROR(logging::g_qSharedLogger,
377 "After reopening ZED stereo camera with serial number {}, object detection failed to reinitialize. sl::ERROR_CODE is: {}",
378 m_unCameraSerialNumber,
379 sl::toString(slReturnCode).get());
380 }
381 }
382 else
383 {
384 // Submit logger message.
385 LOG_WARNING(logging::g_qSharedLogger,
386 "Attempt to reopen ZED stereo camera with serial number {} has failed! Trying again in 5 seconds...",
387 m_unCameraSerialNumber);
388 }
389
390 // Set toggle.
391 bReopenAlreadyChecked = true;
392 }
393 else if (nTimeSinceEpoch % 5 != 0)
394 {
395 // Reset toggle.
396 bReopenAlreadyChecked = false;
397 }
398 }
399 }
400 else
401 {
402 // Release lock.
403 lkReadCameraLock.unlock();
404 // Acquire write lock for camera object.
405 std::unique_lock<std::shared_mutex> lkWriteCameraLock(m_muCameraMutex);
406 // Call generalized update method of zed api.
407 sl::ERROR_CODE slReturnCode = m_slCamera.grab(m_slRuntimeParams);
408 // Release camera lock.
409 lkWriteCameraLock.unlock();
410
411 // Check if this camera is the fusion master instance. Feed data to sl::Fusion.
412 if (m_bCameraIsFusionMaster)
413 {
414 // Acquire write lock.
415 std::unique_lock<std::shared_mutex> lkFusionLock(m_muFusionMutex);
416 // Call generalized process method of Fusion instance.
417 sl::FUSION_ERROR_CODE slReturnCode = m_slFusionInstance.process();
418 // Release lock.
419 lkFusionLock.unlock();
420
421 // Check if fusion data was processed correctly.
422 if (slReturnCode != sl::FUSION_ERROR_CODE::SUCCESS && slReturnCode != sl::FUSION_ERROR_CODE::NO_NEW_DATA_AVAILABLE)
423 {
424 // Submit logger message.
425 LOG_WARNING(logging::g_qSharedLogger,
426 "Unable to process fusion data for camera {} ({})! sl::FUSION_ERROR_CODE is: {}",
427 sl::toString(m_slCameraModel).get(),
428 m_unCameraSerialNumber,
429 sl::toString(slReturnCode).get());
430 }
431 }
432
433 // Check if new frame was computed successfully.
434 if (slReturnCode == sl::ERROR_CODE::SUCCESS)
435 {
436 // Check if normal frames have been requested.
437 if (m_bNormalFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
438 {
439 // Grab regular image and store it in member variable.
440 slReturnCode = m_slCamera.retrieveImage(m_slFrame, constants::ZED_RETRIEVE_VIEW, m_slMemoryType, sl::Resolution(m_nPropResolutionX, m_nPropResolutionY));
441 // Check that the regular frame was retrieved successfully.
442 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
443 {
444 // Submit logger message.
445 LOG_WARNING(logging::g_qSharedLogger,
446 "Unable to retrieve new frame image for stereo camera {} ({})! sl::ERROR_CODE is: {}",
447 sl::toString(m_slCameraModel).get(),
448 m_unCameraSerialNumber,
449 sl::toString(slReturnCode).get());
450 }
451 }
452
453 // Check if depth frames have been requested.
454 if (m_bDepthFramesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
455 {
456 // Grab depth measure and store it in member variable.
457 slReturnCode = m_slCamera.retrieveMeasure(m_slDepthMeasure, m_slDepthMeasureType, m_slMemoryType, sl::Resolution(m_nPropResolutionX, m_nPropResolutionY));
458 // Check that the regular frame was retrieved successfully.
459 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
460 {
461 // Submit logger message.
462 LOG_WARNING(logging::g_qSharedLogger,
463 "Unable to retrieve new depth measure for stereo camera {} ({})! sl::ERROR_CODE is: {}",
464 sl::toString(m_slCameraModel).get(),
465 m_unCameraSerialNumber,
466 sl::toString(slReturnCode).get());
467 }
468
469 // Grab depth grayscale image and store it in member variable.
470 slReturnCode = m_slCamera.retrieveImage(m_slDepthImage, sl::VIEW::DEPTH, m_slMemoryType, sl::Resolution(m_nPropResolutionX, m_nPropResolutionY));
471 // Check that the regular frame was retrieved successfully.
472 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
473 {
474 // Submit logger message.
475 LOG_WARNING(logging::g_qSharedLogger,
476 "Unable to retrieve new depth image for stereo camera {} ({})! sl::ERROR_CODE is: {}",
477 sl::toString(m_slCameraModel).get(),
478 m_unCameraSerialNumber,
479 sl::toString(slReturnCode).get());
480 }
481 }
482
483 // Check if point clouds have been requested.
484 if (m_bPointCloudsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
485 {
486 // Grab regular resized image and store it in member variable.
487 slReturnCode = m_slCamera.retrieveMeasure(m_slPointCloud, sl::MEASURE::XYZBGRA, m_slMemoryType, sl::Resolution(m_nPropResolutionX, m_nPropResolutionY));
488 // Check that the regular frame was retrieved successfully.
489 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
490 {
491 // Submit logger message.
492 LOG_WARNING(logging::g_qSharedLogger,
493 "Unable to retrieve new point cloud for stereo camera {} ({})! sl::ERROR_CODE is: {}",
494 sl::toString(m_slCameraModel).get(),
495 m_unCameraSerialNumber,
496 sl::toString(slReturnCode).get());
497 }
498 }
499
500 // Check if positional tracking is enabled.
501 if (m_slCamera.isPositionalTrackingEnabled())
502 {
503 // Check if poses have been requested.
504 if (m_bPosesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
505 {
506 // Create instance variable for storing the result of retrieving the pose.
507 sl::POSITIONAL_TRACKING_STATE slPoseTrackReturnCode;
508
509 // Check if this camera has Fusion instance enabled.
510 if (m_bCameraIsFusionMaster)
511 {
512 // Get tracking pose from Fusion.
513 slPoseTrackReturnCode = m_slFusionInstance.getPosition(m_slCameraPose, sl::REFERENCE_FRAME::WORLD);
514 }
515 else
516 {
517 // Get normal vision tracking pose from camera.
518 slPoseTrackReturnCode = m_slCamera.getPosition(m_slCameraPose, sl::REFERENCE_FRAME::WORLD);
519 }
520 // Check that the regular frame was retrieved successfully.
521 if (slPoseTrackReturnCode != sl::POSITIONAL_TRACKING_STATE::OK)
522 {
523 // Submit logger message.
524 LOG_WARNING(logging::g_qSharedLogger,
525 "Unable to retrieve new positional tracking pose for stereo camera {} ({})! sl::POSITIONAL_TRACKING_STATE is: {}",
526 sl::toString(m_slCameraModel).get(),
527 m_unCameraSerialNumber,
528 sl::toString(slPoseTrackReturnCode).get());
529 }
530 }
531
532 // Check if geo poses have been requested.
533 if (m_bCameraIsFusionMaster && m_bGeoPosesQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
534 {
535 // Get the fused geo pose from the camera.
536 sl::GNSS_FUSION_STATUS slGeoPoseTrackReturnCode = m_slFusionInstance.getGeoPose(m_slFusionGeoPose);
537 // Check that the geo pose was retrieved successfully.
538 if (slGeoPoseTrackReturnCode != sl::GNSS_FUSION_STATUS::OK && slGeoPoseTrackReturnCode != sl::GNSS_FUSION_STATUS::CALIBRATION_IN_PROGRESS)
539 {
540 // Submit logger message.
541 LOG_WARNING(logging::g_qSharedLogger,
542 "Geo pose tracking state for stereo camera {} ({}) is suboptimal! sl::GNSS_FUSION_STATUS is: {}",
543 sl::toString(m_slCameraModel).get(),
544 m_unCameraSerialNumber,
545 sl::toString(slGeoPoseTrackReturnCode).get());
546 }
547 }
548
549 // Check if floor planes are being requested.
550 if (m_bFloorsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
551 {
552 // Get the current pose of the camera.
553 slReturnCode = m_slCamera.findFloorPlane(m_slFloorPlane,
554 m_slFloorTrackingTransform,
555 m_slCameraPose.getTranslation().y,
556 m_slCameraPose.getRotationMatrix(),
557 m_fExpectedCameraHeightFromFloorTolerance);
558 // Check that the regular frame was retrieved successfully.
559 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
560 {
561 // Submit logger message.
562 LOG_WARNING(logging::g_qSharedLogger,
563 "Unable to retrieve new floor plane for stereo camera {} ({})! sl::ERROR_CODE is: {}",
564 sl::toString(m_slCameraModel).get(),
565 m_unCameraSerialNumber,
566 sl::toString(slReturnCode).get());
567 }
568 }
569 }
570
571 // Get the IMU, barometer, magnetometer, and temperature sensor info from the camera.
572 if (m_bSensorsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
573 {
574 // Get the sensor data from the camera.
575 slReturnCode = m_slCamera.getSensorsData(m_slSensorsData, sl::TIME_REFERENCE::CURRENT);
576
577 // Check if the sensor data was retrieved successfully.
578 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
579 {
580 // Submit logger message.
581 LOG_WARNING(logging::g_qSharedLogger,
582 "Unable to retrieve sensor data for stereo camera {} ({})! sl::ERROR_CODE is: {}",
583 sl::toString(m_slCameraModel).get(),
584 m_unCameraSerialNumber,
585 sl::toString(slReturnCode).get());
586 }
587 }
588
589 // Check if object detection is enabled.
590 if (m_slCamera.isObjectDetectionEnabled())
591 {
592 // Check if objects have been requested.
593 if (m_bObjectsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
594 {
595 // Get updated objects from camera.
596 slReturnCode = m_slCamera.retrieveObjects(m_slDetectedObjects);
597 // Check that the regular frame was retrieved successfully.
598 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
599 {
600 // Submit logger message.
601 LOG_WARNING(logging::g_qSharedLogger,
602 "Unable to retrieve new object data for stereo camera {} ({})! sl::ERROR_CODE is: {}",
603 sl::toString(m_slCameraModel).get(),
604 m_unCameraSerialNumber,
605 sl::toString(slReturnCode).get());
606 }
607 }
608
609 // Check if batched object data is enabled.
610 if (m_slObjectDetectionBatchParams.enable)
611 {
612 // Check if batched objects have been requested.
613 if (m_bBatchedObjectsQueued.load(ATOMIC_MEMORY_ORDER_METHOD))
614 {
615 // Get updated batched objects from camera.
616 slReturnCode = m_slCamera.getObjectsBatch(m_slDetectedObjectsBatched);
617 // Check that the regular frame was retrieved successfully.
618 if (slReturnCode != sl::ERROR_CODE::SUCCESS)
619 {
620 // Submit logger message.
621 LOG_WARNING(logging::g_qSharedLogger,
622 "Unable to retrieve new batched object data for stereo camera {} ({})! sl::ERROR_CODE is: {}",
623 sl::toString(m_slCameraModel).get(),
624 m_unCameraSerialNumber,
625 sl::toString(slReturnCode).get());
626 }
627 }
628 }
629 }
630 // Detection not enabled, but got requests.
631 else if (m_bObjectsQueued.load(std::memory_order_relaxed) || m_bBatchedObjectsQueued.load(std::memory_order_relaxed))
632 {
633 // Submit logger message.
634 LOG_WARNING(logging::g_qSharedLogger,
635 "Unable to retrieve new object data for stereo camera {} ({})! Object detection is disabled!",
636 sl::toString(m_slCameraModel).get(),
637 m_unCameraSerialNumber);
638 }
639 }
640 else
641 {
642 // Submit logger message.
643 LOG_ERROR(logging::g_qSharedLogger,
644 "Unable to update stereo camera {} ({}) frames, measurements, and sensors! sl::ERROR_CODE is: {}. Closing camera...",
645 sl::toString(m_slCameraModel).get(),
646 m_unCameraSerialNumber,
647 sl::toString(slReturnCode).get());
648
649 // Release camera resources.
650 m_slCamera.close();
651 }
652 }
653
654 // Acquire a shared_lock on the frame copy queue.
655 std::shared_lock<std::shared_mutex> lkSchedulers(m_muPoolScheduleMutex);
656 // Check if any requests have been made.
657 if (!m_qFrameCopySchedule.empty() || !m_qGPUFrameCopySchedule.empty() || !m_qCustomBoxIngestSchedule.empty() || !m_qPoseCopySchedule.empty() ||
658 !m_qGeoPoseCopySchedule.empty() || !m_qFloorCopySchedule.empty() || !m_qSensorsCopySchedule.empty() || !m_qObjectDataCopySchedule.empty() ||
659 !m_qObjectBatchedDataCopySchedule.empty())
660 {
661 // Add the length of all queues together to determine how many tasks need to be run.
662 size_t siTotalQueueLength = m_qFrameCopySchedule.size() + m_qGPUFrameCopySchedule.size() + m_qCustomBoxIngestSchedule.size() + m_qPoseCopySchedule.size() +
663 m_qGeoPoseCopySchedule.size() + m_qFloorCopySchedule.size() + m_qSensorsCopySchedule.size() + m_qObjectDataCopySchedule.size() +
664 m_qObjectBatchedDataCopySchedule.size();
665
666 // Start the thread pool to copy member variables to requesting other threads. Num of tasks queued depends on number of member variables updates and requests.
667 this->RunDetachedPool(siTotalQueueLength, m_nNumFrameRetrievalThreads);
668
669 // Static bool for keeping track of reset toggle action.
670 static bool bQueueTogglesAlreadyReset = false;
671 // Get current time.
672 std::chrono::_V2::system_clock::duration tmCurrentTime = std::chrono::high_resolution_clock::now().time_since_epoch();
673 // Only reset once every couple seconds.
674 if (std::chrono::duration_cast<std::chrono::seconds>(tmCurrentTime).count() % 31 == 0 && !bQueueTogglesAlreadyReset)
675 {
676 // Reset queue counters.
677 m_bNormalFramesQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
678 m_bDepthFramesQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
679 m_bPointCloudsQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
680 m_bPosesQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
681 m_bGeoPosesQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
682 m_bFloorsQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
683 m_bSensorsQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
684 m_bObjectsQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
685 m_bBatchedObjectsQueued.store(false, ATOMIC_MEMORY_ORDER_METHOD);
686
687 // Set reset toggle.
688 bQueueTogglesAlreadyReset = true;
689 }
690 // Crucial for toggle action. If time is not evenly divisible and toggles have previously been set, reset queue reset boolean.
691 else if (bQueueTogglesAlreadyReset)
692 {
693 // Reset reset toggle.
694 bQueueTogglesAlreadyReset = false;
695 }
696
697 // Wait for thread pool to finish.
698 this->JoinPool();
699 }
700
701 // Release lock on frame copy queue.
702 lkSchedulers.unlock();
703}
void RunDetachedPool(const unsigned int nNumTasksToQueue, const unsigned int nNumThreads=2, const bool bForceStopCurrentThreads=false)
When this method is called, it starts a thread pool full of threads that don't return std::futures (l...
Definition AutonomyThread.hpp:336
void JoinPool()
Waits for pool to finish executing tasks. This method will block the calling code until thread is fin...
Definition AutonomyThread.hpp:439
sl::ERROR_CODE EnableSpatialMapping() override
Enabled the spatial mapping feature of the camera. Pose tracking will be enabled if it is not already...
Definition ZEDCam.cpp:2080
sl::ERROR_CODE EnableObjectDetection(const bool bEnableBatching=false) override
Enables the object detection and tracking feature of the camera.
Definition ZEDCam.cpp:2162

◆ PooledLinearCode()

void ZEDCam::PooledLinearCode ( )
overrideprivatevirtual

This method holds the code that is run in the thread pool started by the ThreadedContinuousCode() method. It copies the data from the different data objects to references of the same type stored in a queue filled by the Request methods.

Author
ClayJay3 (clayt.nosp@m.onra.nosp@m.ycowe.nosp@m.n@gm.nosp@m.ail.c.nosp@m.om)
Date
2023-09-08

Implements AutonomyThread< void >.

716{
718 // Frame queue.
720 // Check if we are using CPU or GPU mats.
721 if (m_slMemoryType == sl::MEM::CPU)
722 {
723 // Acquire mutex for getting frames out of the queue.
724 std::unique_lock<std::shared_mutex> lkFrameQueue(m_muFrameCopyMutex);
725 // Check if the queue is empty.
726 if (!m_qFrameCopySchedule.empty())
727 {
728 // Get frame container out of queue.
729 containers::FrameFetchContainer<cv::Mat> stContainer = m_qFrameCopySchedule.front();
730 // Pop out of queue.
731 m_qFrameCopySchedule.pop();
732 // Release lock.
733 lkFrameQueue.unlock();
734
735 // Determine which frame should be copied.
736 switch (stContainer.eFrameType)
737 {
738 case PIXEL_FORMATS::eBGRA: *stContainer.pFrame = imgops::ConvertSLMatToCVMat(m_slFrame); break;
739 case PIXEL_FORMATS::eDepthMeasure: *stContainer.pFrame = imgops::ConvertSLMatToCVMat(m_slDepthMeasure); break;
740 case PIXEL_FORMATS::eDepthImage: *stContainer.pFrame = imgops::ConvertSLMatToCVMat(m_slDepthImage); break;
741 case PIXEL_FORMATS::eXYZBGRA: *stContainer.pFrame = imgops::ConvertSLMatToCVMat(m_slPointCloud); break;
742 default: *stContainer.pFrame = imgops::ConvertSLMatToCVMat(m_slFrame); break;
743 }
744
745 // Signal future that the frame has been successfully retrieved.
746 stContainer.pCopiedFrameStatus->set_value(true);
747 }
748
749 // Check if anything has been added to the GPU queue.
750 if (!m_qGPUFrameCopySchedule.empty())
751 {
752 // Submit logger error.
753 LOG_ERROR(logging::g_qSharedLogger,
754 "ZEDCam ({}) is in CPU sl::Mat mode but a GPU mat has been added to the copy queue! Whichever thread queued the frame will now appear frozen if "
755 "future.get() is called. Either switch the camera to GPU Mat mode in AutonomyConstants.h or stop queueing frames of type cv::Mat.",
756 m_unCameraSerialNumber);
757 }
758 }
759 // Use GPU mat.
760 else
761 {
762 // Acquire mutex for getting frames out of the queue.
763 std::unique_lock<std::shared_mutex> lkFrameQueue(m_muFrameCopyMutex);
764 // Check if the queue is empty.
765 if (!m_qGPUFrameCopySchedule.empty())
766 {
767 // Get frame container out of queue.
768 containers::FrameFetchContainer<cv::cuda::GpuMat> stContainer = m_qGPUFrameCopySchedule.front();
769 // Pop out of queue.
770 m_qGPUFrameCopySchedule.pop();
771 // Release lock.
772 lkFrameQueue.unlock();
773
774 // Determine which frame should be copied.
775 switch (stContainer.eFrameType)
776 {
777 case PIXEL_FORMATS::eBGRA: *stContainer.pFrame = imgops::ConvertSLMatToGPUMat(m_slFrame); break;
778 case PIXEL_FORMATS::eDepthMeasure: *stContainer.pFrame = imgops::ConvertSLMatToGPUMat(m_slDepthMeasure); break;
779 case PIXEL_FORMATS::eDepthImage: *stContainer.pFrame = imgops::ConvertSLMatToGPUMat(m_slDepthImage); break;
780 case PIXEL_FORMATS::eXYZBGRA: *stContainer.pFrame = imgops::ConvertSLMatToGPUMat(m_slPointCloud); break;
781 default: *stContainer.pFrame = imgops::ConvertSLMatToGPUMat(m_slFrame); break;
782 }
783
784 // Signal future that the frame has been successfully retrieved.
785 stContainer.pCopiedFrameStatus->set_value(true);
786 }
787
788 // Check if anything has been added to the CPU queue.
789 if (!m_qFrameCopySchedule.empty())
790 {
791 // Submit logger error.
792 LOG_ERROR(logging::g_qSharedLogger,
793 "ZEDCam ({}) is in GPU sl::Mat mode but a CPU mat has been added to the copy queue! Whichever thread queued the frame will now appear frozen if "
794 "future.get() is called. Either switch the camera to GPU Mat mode in AutonomyConstants.h or stop queueing frames of type cv::cuda::GpuMat.",
795 m_unCameraSerialNumber);
796 }
797 }
798
800 // Pose queue.
802 // Acquire mutex for getting data out of the pose queue.
803 std::unique_lock<std::shared_mutex> lkPoseQueue(m_muPoseCopyMutex);
804 // Check if the queue is empty.
805 if (!m_qPoseCopySchedule.empty())
806 {
807 // Get pose container out of queue.
808 containers::DataFetchContainer<Pose> stContainer = m_qPoseCopySchedule.front();
809 // Pop out of queue.
810 m_qPoseCopySchedule.pop();
811 // Release lock.
812 lkPoseQueue.unlock();
813
814 // Rotate the ZED position coordinate frame to realign with the UTM global coordinate frame.
815 std::vector<numops::CoordinatePoint<double>> vPointCloud;
816 vPointCloud.emplace_back(m_slCameraPose.getTranslation().x, m_slCameraPose.getTranslation().y, m_slCameraPose.getTranslation().z);
817 // Get angle realignments.
818 double dNewXO = numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).x + m_dPoseOffsetXO, 0.0, 360.0);
819 double dNewYO = numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).y + m_dPoseOffsetYO, 0.0, 360.0);
820 double dNewZO = numops::InputAngleModulus<double>(m_slCameraPose.getEulerAngles(false).z + m_dPoseOffsetZO, 0.0, 360.0);
821 // Rotate coordinate frame.
822 numops::CoordinateFrameRotate3D(vPointCloud, m_dPoseOffsetXO, m_dPoseOffsetYO, m_dPoseOffsetZO);
823 // Repack values into pose.
824 Pose stPose(vPointCloud[0].tX + m_dPoseOffsetX, vPointCloud[0].tY + m_dPoseOffsetY, vPointCloud[0].tZ + m_dPoseOffsetZ, dNewXO, dNewYO, dNewZO);
825
826 // ISSUE NOTE: This might be needed in the future if we ever change our coordinate system on the ZED. It can be used to fix the directions of the Pose's coordinate system.
827 // // Check ZED coordinate system.
828 // switch (m_slCameraParams.coordinate_system)
829 // {
830 // case sl::COORDINATE_SYSTEM::LEFT_HANDED_Y_UP:
831 // {
832 // // Realign based on the signedness of this coordinate system. Z is backwards.
833 // stPose.stTranslation.dZ *= -1;
834 // break;
835 // }
836 // default:
837 // {
838 // // No need to flip signs for other coordinate systems.
839 // break;
840 // }
841 // }
842
843 // Copy pose.
844 *stContainer.pData = stPose;
845
846 // Signal future that the data has been successfully retrieved.
847 stContainer.pCopiedDataStatus->set_value(true);
848 }
849 else
850 {
851 // Release lock.
852 lkPoseQueue.unlock();
853 }
854
856 // GeoPose queue.
858 // Acquire mutex for getting data out of the geopose queue.
859 std::unique_lock<std::shared_mutex> lkGeoPoseQueue(m_muGeoPoseCopyMutex);
860 // Check if the queue is empty.
861 if (!m_qGeoPoseCopySchedule.empty())
862 {
863 // Get pose container out of queue.
864 containers::DataFetchContainer<sl::GeoPose> stContainer = m_qGeoPoseCopySchedule.front();
865 // Pop out of queue.
866 m_qGeoPoseCopySchedule.pop();
867 // Release lock.
868 lkGeoPoseQueue.unlock();
869
870 // Copy pose.
871 *stContainer.pData = sl::GeoPose(m_slFusionGeoPose);
872
873 // Signal future that the data has been successfully retrieved.
874 stContainer.pCopiedDataStatus->set_value(true);
875 }
876 else
877 {
878 // Release lock.
879 lkGeoPoseQueue.unlock();
880 }
881
883 // Plane queue.
885 // Acquire mutex for getting frames out of the plane queue.
886 std::unique_lock<std::shared_mutex> lkPlaneQueue(m_muFloorCopyMutex);
887 // Check if the queue is empty.
888 if (!m_qFloorCopySchedule.empty())
889 {
890 // Get frame container out of queue.
891 containers::DataFetchContainer<sl::Plane> stContainer = m_qFloorCopySchedule.front();
892 // Pop out of queue.
893 m_qFloorCopySchedule.pop();
894 // Release lock.
895 lkPlaneQueue.unlock();
896
898 // Copy plane.
898 *stContainer.pData = sl::Plane(m_slFloorPlane);
899
900 // Signal future that the data has been successfully retrieved.
901 stContainer.pCopiedDataStatus->set_value(true);
902 }
903 else
904 {
905 // Release lock.
906 lkPlaneQueue.unlock();
907 }
908
910 // Sensors queue.
912 // Acquire mutex for getting frames out of the sensors queue.
913 std::unique_lock<std::shared_mutex> lkSensorsQueue(m_muSensorsCopyMutex);
914 // Check if the queue is empty.
915 if (!m_qSensorsCopySchedule.empty())
916 {
917 // Get frame container out of queue.
918 containers::DataFetchContainer<sl::SensorsData> stContainer = m_qSensorsCopySchedule.front();
919 // Pop out of queue.
920 m_qSensorsCopySchedule.pop();
921 // Release lock.
922 lkSensorsQueue.unlock();
923
925 // Copy sensor data.
925 *stContainer.pData = sl::SensorsData(m_slSensorsData);
926
928 // Signal future that the data has been successfully retrieved.
928 stContainer.pCopiedDataStatus->set_value(true);
929 }
930 else
931 {
932 // Release lock.
933 lkSensorsQueue.unlock();
934 }
935
937 // ObjectData queue.
939 // Acquire mutex for getting data out of the object data queue.
940 std::unique_lock<std::shared_mutex> lkObjectDataQueue(m_muObjectDataCopyMutex);
941 // Check if the queue is empty.
942 if (!m_qObjectDataCopySchedule.empty())
943 {
944 // Get frame container out of queue.
945 containers::DataFetchContainer<std::vector<sl::ObjectData>> stContainer = m_qObjectDataCopySchedule.front();
946 // Pop out of queue.
947 m_qObjectDataCopySchedule.pop();
948 // Release lock.
949 lkObjectDataQueue.unlock();
950
951 // Make copy of object vector. (Apparently the assignment operator actually does a deep copy)
952 *stContainer.pData = m_slDetectedObjects.object_list;
953
954 // Signal future that the data has been successfully retrieved.
955 stContainer.pCopiedDataStatus->set_value(true);
956 }
957 else
958 {
959 // Release lock.
960 lkObjectDataQueue.unlock();
961 }
962
964 // ObjectData Batched queue.
966 // Acquire mutex for getting data out of the batched object data queue.
967 std::unique_lock<std::shared_mutex> lkObjectBatchedDataQueue(m_muObjectBatchedDataCopyMutex);
968 // Check if the queue is empty.
969 if (!m_qObjectBatchedDataCopySchedule.empty())
970 {
971 // Get frame container out of queue.
972 containers::DataFetchContainer<std::vector<sl::ObjectsBatch>> stContainer = m_qObjectBatchedDataCopySchedule.front();
973 // Pop out of queue.
974 m_qObjectBatchedDataCopySchedule.pop();
975 // Release lock.
976 lkObjectBatchedDataQueue.unlock();
977
978 // Make copy of object vector. (Apparently the assignment operator actually does a deep copy)
979 *stContainer.pData = m_slDetectedObjectsBatched;
980
981 // Signal future that the data has been successfully retrieved.
982 stContainer.pCopiedDataStatus->set_value(true);
983 }
984 else
985 {
986 // Release lock.
987 lkObjectBatchedDataQueue.unlock();
988 }
989}
cv::cuda::GpuMat ConvertSLMatToGPUMat(sl::Mat &slInputMat)
Convert a ZEDSDK sl::Mat object into an OpenCV cv::cuda::GpuMat object. Keeps all Mat memory in GPU V...
Definition ImageOperations.hpp:127
cv::Mat ConvertSLMatToCVMat(sl::Mat &slInputMat)
Convert a ZEDSDK sl::Mat object into an OpenCV cv::Mat object. This copies the mat from GPU memory to...
Definition ImageOperations.hpp:106
constexpr void CoordinateFrameRotate3D(std::vector< CoordinatePoint< T > > &vPointCloud, const double dXRotationDegrees, const double dYRotationDegrees, const double dZRotationDegrees)
This method will rotate a list of 3D coordinate points a variable amount of degrees around the X,...
Definition NumberOperations.hpp:269
The documentation for this class was generated from the following files: