How to launch qsr and Velocity Costmaps #3
Hi, always happy when someone reads the papers ;) In theory it should be as easy as launching the provided launch file.

Currently the system relies on the output produced by our people perception module. We have a deployment of our robot at the end of November where the system is supposed to be used outside the lab for the first time. For this we will collect more training data (currently only the two scenarios shown in the video are available) and also make the system easier to use. However, if you just run the launch file above and use the tracker we provide, it should work, given that you are using the modified navigation stack. If you are interested, I can keep you in the loop regarding those upgrades, which should hopefully make it more useful.
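A minimal sanity check may help here: since the velocity costmaps depend on the people perception output, it is worth confirming that the tracker is actually publishing before debugging anything else. This is only a sketch; the topic name `/people_tracker/positions` is an assumption about the tracker's output and may differ on your setup.

```python
#!/usr/bin/env python
# Sanity check: is the (assumed) people perception topic publishing at all?
# The topic name below is a guess -- verify with `rostopic list` on your robot.
import rospy

TRACKER_TOPIC = "/people_tracker/positions"  # assumption, adjust to your setup

if __name__ == "__main__":
    rospy.init_node("tracker_sanity_check")
    try:
        # AnyMsg avoids hard-coding the tracker's message type
        rospy.wait_for_message(TRACKER_TOPIC, rospy.AnyMsg, timeout=10.0)
        rospy.loginfo("Tracker output received on %s", TRACKER_TOPIC)
    except rospy.ROSException:
        rospy.logwarn("No tracker output on %s within 10s -- is people perception running?",
                      TRACKER_TOPIC)
```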
Hi, I've made some progress ^_^ My robot is a kobuki-like robot. It only receives twists (cmd_vel) from the navigation stack and converts the two wheel encoders' data to odom. The LRF is a Hokuyo UST20 laser. The RGB-D camera is an Xtion (I didn't use it in Gazebo, but my real robot has one); the Xtion is used to detect objects. I have read almost all of your lab's papers and am now focusing on the AAAI 2016 paper (Unsupervised Learning of Qualitative Motion Behaviours by a Mobile Robot). I'm just a little confused about how the learning model works in detail. I have three questions at the present stage:

I'm looking forward to hearing from you.
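For reference, a minimal sketch of the encoder-to-odometry conversion described above, assuming the wheel travel per update has already been computed from the encoder ticks. `WHEEL_BASE` and the frame names are placeholders for a kobuki-like base, not values from this project.

```python
#!/usr/bin/env python
# Minimal differential-drive odometry sketch: dead reckoning from per-update
# left/right wheel travel (in metres) to a nav_msgs/Odometry message and a TF.
import math
import rospy
import tf
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion

WHEEL_BASE = 0.23  # metres; assumption -- measure your own base


class DiffDriveOdom(object):
    def __init__(self):
        self.x = self.y = self.th = 0.0
        self.pub = rospy.Publisher("odom", Odometry, queue_size=10)
        self.br = tf.TransformBroadcaster()

    def update(self, d_left, d_right, dt):
        # Standard differential-drive dead reckoning
        d_centre = (d_left + d_right) / 2.0
        d_theta = (d_right - d_left) / WHEEL_BASE
        self.x += d_centre * math.cos(self.th + d_theta / 2.0)
        self.y += d_centre * math.sin(self.th + d_theta / 2.0)
        self.th += d_theta

        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        q = tf.transformations.quaternion_from_euler(0, 0, self.th)
        odom.pose.pose.orientation = Quaternion(*q)
        odom.twist.twist.linear.x = d_centre / dt
        odom.twist.twist.angular.z = d_theta / dt
        self.pub.publish(odom)
        # Also broadcast odom -> base_link so the navigation stack can use it
        self.br.sendTransform((self.x, self.y, 0.0), q, odom.header.stamp,
                              "base_link", "odom")
```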
Hi, the problem with the velocity costmap is that it is only published when a human is detected and the interaction type has been classified. I will create a simulation set-up using our robot and simulator at some point rather soon-ish and make that available for testing as well.

Regarding mongodb: the people perception (provided you do not want to use the logging feature) and the velocity costmaps work without mongodb. For most other parts of the system you will need it, though. It doesn't matter if it is just a local network. The mongodb is a database that we use to store all kinds of important information, like topological maps, and for data collection.
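Because the costmap is only published after a detection, a small monitor like the following can help tell "no human detected yet" apart from "pipeline not running". This is a sketch, and `/velocity_costmap` is an assumed topic name, not necessarily what the package actually publishes.

```python
#!/usr/bin/env python
# Monitor illustrating the event-driven nature of the velocity costmap: the
# topic stays silent until a person has been detected and the interaction
# classified. The topic name below is an assumption -- check `rostopic list`.
import rospy


class CostmapMonitor(object):
    def __init__(self, topic="/velocity_costmap"):
        self.last_stamp = None
        # AnyMsg avoids hard-coding the (assumed) costmap message type
        rospy.Subscriber(topic, rospy.AnyMsg, self.callback)

    def callback(self, _msg):
        self.last_stamp = rospy.Time.now()

    def spin(self):
        rate = rospy.Rate(0.2)  # report every 5 seconds
        while not rospy.is_shutdown():
            if self.last_stamp is None:
                rospy.loginfo("No velocity costmap yet -- probably no human detected")
            else:
                age = (rospy.Time.now() - self.last_stamp).to_sec()
                rospy.loginfo("Last velocity costmap received %.1fs ago", age)
            rate.sleep()


if __name__ == "__main__":
    rospy.init_node("velocity_costmap_monitor")
    CostmapMonitor().spin()
```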
Hi Dondrup, I'm still working on it right now.

step 1: Install ros-indigo-ros-base (Ubuntu 14.04)

At least I can install and compile the navigation stack by following these steps.
Hi
I read the project Qualitative Constraints for Human-aware Robot Navigation and watched the video. I'm very interested in this project and want to test it on my own robot.
I've installed all the packages, including the modified navigation meta package, but I don't know how to launch this project.
Could you tell me the steps to start this project?
http://lcas.lincoln.ac.uk/cdondrup/software.html