This is a question specific to PCL (the Point Cloud Library). I am trying to display a range image generated from the live data stream of a Kinect. I can already read point clouds from the Kinect and display them; however, generating the range image has been very painful. I have read the example in the PCL documentation and used the same code. The resulting range image is very small and contains only NaNs. The showRangeImage function appears to hang (so I have commented that call out), and the 3D viewer code from the PCL example produces an error in a vtkOutputWindow.
I am not sure what the sensorPose input to the createFromPointCloud function should be. I am currently using
Eigen::Affine3f sensorPose = (Eigen::Affine3f)Eigen::Translation3f(0.0f, 0.0f, 0.0f);
and I don't know whether this is what causes the error.
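(For reference, the PCL range image tutorial derives the sensor pose from the pose metadata stored in the cloud itself, as in the else block further down. A minimal sketch of that pattern, assuming the grabber fills in sensor_origin_ and sensor_orientation_; if it does not, they default to the origin and the identity quaternion, which is equivalent to the pose above:)

// Sketch: build the sensor pose from the pose stored in the point cloud itself.
Eigen::Affine3f sensorPose =
  Eigen::Affine3f (Eigen::Translation3f (cloud->sensor_origin_[0],
                                         cloud->sensor_origin_[1],
                                         cloud->sensor_origin_[2])) *
  Eigen::Affine3f (cloud->sensor_orientation_);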
I have also tried reading from a pcd file on disk (the relevant code is in the else block of the program below), again using the same code and example from the PCL documentation. Reading the pcd file from disk does not seem to work either.
Any suggestions would be greatly appreciated.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/io/openni_grabber.h>
#include <pcl/common/time.h>
#include <pcl/visualization/pcl_visualizer.h>
#include <pcl/visualization/cloud_viewer.h>
#include <pcl/range_image/range_image.h>
#include <pcl/visualization/range_image_visualizer.h>
#include <pcl/io/pcd_io.h>
typedef pcl::PointXYZRGBA PointT;
typedef pcl::PointXYZ PointType;
pcl::visualization::CloudViewer viewer("PCL Viewer");
pcl::visualization::RangeImageVisualizer range_image_widget("Range image");
pcl::visualization::PCLVisualizer viewer3D ("3D Viewer");
//From PCL documentation example code
void
setViewerPose (pcl::visualization::PCLVisualizer& viewer, const Eigen::Affine3f& viewer_pose)
{
  Eigen::Vector3f pos_vector = viewer_pose * Eigen::Vector3f (0, 0, 0);
  Eigen::Vector3f look_at_vector = viewer_pose.rotation () * Eigen::Vector3f (0, 0, 1) + pos_vector;
  Eigen::Vector3f up_vector = viewer_pose.rotation () * Eigen::Vector3f (0, -1, 0);
  viewer.setCameraPose (pos_vector[0], pos_vector[1], pos_vector[2],
                        look_at_vector[0], look_at_vector[1], look_at_vector[2],
                        up_vector[0], up_vector[1], up_vector[2]);
}
// Callback method
void cloud_cb_ (const pcl::PointCloud<pcl::PointXYZRGBA>::ConstPtr &cloud)
{
  static unsigned count = 0;
  static double last = pcl::getTime ();
  if (++count == 30)
  {
    double now = pcl::getTime ();
    std::cout << "distance of center pixel :" << cloud->points[(cloud->width >> 1) * (cloud->height + 1)].z
              << " mm. Average framerate: " << double(count)/double(now - last) << " Hz" << std::endl;
    count = 0;
    last = now;
  }

  // PCL viewer //
  // Display point cloud:
  viewer.showCloud (cloud);

  // DO USEFUL OPERATIONS HERE

  // Create a range image and display it.
  // We now want to create a range image from the above point cloud, with a 1 deg angular resolution.
  float angularResolution = (float) (1.0f * (M_PI/180.0f));   //   1.0 degree in radians
  float maxAngleWidth     = (float) (360.0f * (M_PI/180.0f)); // 360.0 degrees in radians
  float maxAngleHeight    = (float) (180.0f * (M_PI/180.0f)); // 180.0 degrees in radians
  Eigen::Affine3f sensorPose = (Eigen::Affine3f)Eigen::Translation3f (0.0f, 0.0f, 0.0f);
  pcl::RangeImage::CoordinateFrame coordinate_frame = pcl::RangeImage::CAMERA_FRAME;
  float noiseLevel = 0.00;
  float minRange = 0.0f;
  int borderSize = 1;

  boost::shared_ptr<pcl::RangeImage> range_image_ptr (new pcl::RangeImage);
  pcl::RangeImage& rangeImage = *range_image_ptr;

  // Range image for live stream from Kinect
  if (1)
  {
    rangeImage.createFromPointCloud (*cloud, angularResolution, maxAngleWidth, maxAngleHeight,
                                     sensorPose, coordinate_frame, noiseLevel, minRange, borderSize);
  }
  // Alternative test - from PCD file on disk
  else
  {
    pcl::PointCloud<PointType>::Ptr point_cloud_ptr (new pcl::PointCloud<PointType>);
    pcl::PointCloud<PointType>& point_cloud = *point_cloud_ptr;
    // pcd file from http://download.ros.org/data/pcl/
    if (pcl::io::loadPCDFile ("src\\office_scene.pcd", point_cloud) == -1)
      std::cout << "Cannot load scene file\n";
    Eigen::Affine3f scene_sensor_pose (Eigen::Affine3f::Identity ());
    scene_sensor_pose = Eigen::Affine3f (Eigen::Translation3f (point_cloud.sensor_origin_[0],
                                                                point_cloud.sensor_origin_[1],
                                                                point_cloud.sensor_origin_[2])) *
                        Eigen::Affine3f (point_cloud.sensor_orientation_);
    rangeImage.createFromPointCloud (point_cloud, angularResolution, maxAngleWidth, maxAngleHeight,
                                     scene_sensor_pose, coordinate_frame, noiseLevel, minRange, borderSize);
  }

  std::cout << rangeImage << "\n";

  // showRangeImage seems to take a very long time (infinite loop?). Hence commented out
  //range_image_widget.showRangeImage (rangeImage);

  // viewer3D gives error
  viewer3D.setBackgroundColor (1, 1, 1);
  pcl::visualization::PointCloudColorHandlerCustom<pcl::PointWithRange> range_image_color_handler (range_image_ptr, 0, 0, 0);
  viewer3D.addPointCloud (range_image_ptr, range_image_color_handler, "range image");
  viewer3D.setPointCloudRenderingProperties (pcl::visualization::PCL_VISUALIZER_POINT_SIZE, 1, "range image");
  viewer3D.initCameraParameters ();
  setViewerPose (viewer3D, rangeImage.getTransformationToWorldSystem ());
}
int main ()
{
  // create a new grabber for OpenNI devices
  pcl::Grabber* interface = new pcl::OpenNIGrabber ();

  // make a callback function from a free function
  boost::function<void (const pcl::PointCloud<pcl::PointXYZRGBA>::ConstPtr&)> f =
    boost::bind (&cloud_cb_, _1);

  // connect the callback function for the desired signal; in this case it's a point cloud with color values
  boost::signals2::connection c = interface->registerCallback (f);

  // start receiving point clouds
  interface->start ();

  // wait until the user quits the program with Ctrl-C, but without busy-waiting
  while (true)
    boost::this_thread::sleep (boost::posix_time::seconds (1));

  // stop the grabber
  interface->stop ();
  return (0);
}
Answer (score: 2)
mnut, I found a solution to the problem. The range image you are getting is so small because the angular parameters are set incorrectly. The correct approach is to use the Kinect sensor's specifications in the following lines (see https://msdn.microsoft.com/en-us/library/jj131033.aspx):
float maxAngleWidth = (float) (57.0f * (M_PI / 180.0f));
float maxAngleHeight = (float) (43.0f * (M_PI / 180.0f));
float angularResolution = (float)(57.0f / 640.0f * (M_PI/180.0f));
The numbers 57 and 43 are the horizontal and vertical fields of view in degrees, and the angular resolution is the angle covered by a single pixel, so you should divide 57 by 640 (or 43 by 480).
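As a quick sanity check of that arithmetic (a sketch; the 57 and 43 degree fields of view come from the Kinect specification linked above, and the variable names are mine):

// 57 deg horizontal FOV over 640 columns: 57/640 ≈ 0.089 deg per pixel ≈ 0.00155 rad per pixel
float angularResolutionX = (float) (57.0f / 640.0f * (M_PI / 180.0f));
// 43 deg vertical FOV over 480 rows:      43/480 ≈ 0.090 deg per pixel ≈ 0.00156 rad per pixel
float angularResolutionY = (float) (43.0f / 480.0f * (M_PI / 180.0f));

If your PCL version provides the createFromPointCloud overload that takes separate x and y resolutions, you can pass both values instead of a single angularResolution.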
For the sensorPose, I used the following (default) value:
Eigen::Affine3f sensorPose = Eigen::Affine3f::Identity();
Finally, to display the generated range image, use:
range_image_widget.showRangeImage(rangeImage);
range_image_widget.spinOnce(); // This line was missing
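Putting the pieces together, the range image part of the callback would look roughly like this (a sketch based on the values above, not a drop-in replacement; variable names match the question):

// Corrected Kinect parameters for the live stream
float angularResolution = (float) (57.0f / 640.0f * (M_PI / 180.0f));
float maxAngleWidth     = (float) (57.0f * (M_PI / 180.0f));
float maxAngleHeight    = (float) (43.0f * (M_PI / 180.0f));
Eigen::Affine3f sensorPose = Eigen::Affine3f::Identity ();

pcl::RangeImage rangeImage;
rangeImage.createFromPointCloud (*cloud, angularResolution, maxAngleWidth, maxAngleHeight,
                                 sensorPose, pcl::RangeImage::CAMERA_FRAME,
                                 0.0f /*noiseLevel*/, 0.0f /*minRange*/, 1 /*borderSize*/);

range_image_widget.showRangeImage (rangeImage);
range_image_widget.spinOnce ();  // without this the widget never gets a chance to redraw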
However, the result is far from satisfactory, because the frame rate will be very low. See the following video for a demonstration: http://youtu.be/iao3BeI4LqM
Also, viewer3D is not suitable for displaying the range image.