
[Low Priority] Command Center UI #5

Open
SwapnilPande opened this issue Jan 12, 2019 · 2 comments
@SwapnilPande

Which section of robot code is this for?
Command Center

Description of feature

  • Determine what data needs to be presented to the human drivers such as:
    • Planned autonomy route
    • Map of obstacles on the field
    • Current estimated location of the robot
    • High level controller state
  • Create an intuitive UI to present all of this data

Additional Notes
We need to balance providing data to the drivers against minimizing bandwidth usage. Only present data that is strictly necessary for the drivers to determine whether the robot is functioning properly.

@partlygloudy

I've partially implemented this in the mission control package. There's an HTML page that displays a map of the competition area with the robot drawn at its estimated location, plus a table in the middle of the page that can display any data we want from the robot. The page updates via a bit of JavaScript that opens a websocket connection to the rosbridge node (which must also be running), allowing it to subscribe to any ROS topic. This uses the roslibjs JavaScript library.
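For reference, roslibjs talks to rosbridge by sending JSON operations over that websocket; a subscription is just a "subscribe" op, and the protocol also accepts an optional throttle_rate (in milliseconds) that asks rosbridge to rate-limit messages server-side, which is already relevant to the bandwidth concern above. A small Python sketch of what that message looks like (the helper name is hypothetical, not existing code):

```python
import json

def make_subscribe_msg(topic, msg_type, throttle_rate_ms=0):
    """Build a rosbridge-protocol 'subscribe' operation as a JSON string.

    throttle_rate asks rosbridge to wait at least this many milliseconds
    between messages delivered to this subscriber.
    """
    return json.dumps({
        "op": "subscribe",
        "topic": topic,
        "type": msg_type,
        "throttle_rate": throttle_rate_ms,
    })
```

In roslibjs the equivalent is passing a `throttle_rate` option when constructing the `ROSLIB.Topic` you subscribe to.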

I was thinking that once we decide what data we want to observe, we can create a version of each of those topics called "updates/[topic]" that publishes at a much slower rate than the topic used by the other robot nodes (maybe 1 Hz). For example, "robot_pose_estimate" is published very fast on the robot, but we could also republish the message on "updates/robot_pose_estimate" at 1 Hz, which would still be plenty fast to tell whether the robot is behaving properly.
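If we roll our own republisher node, the core logic is just "forward one message, then drop everything until the period elapses." A minimal Python sketch of that logic (the class name and wiring are hypothetical; the clock is injectable so it can be tested without real time):

```python
import time

class ThrottledRelay:
    """Forward at most one message per `period` seconds, dropping the rest.

    `publish_fn` stands in for a ROS publisher's publish() on the slow
    "updates/..." topic; the instance itself is used as the subscriber
    callback on the fast topic.
    """

    def __init__(self, publish_fn, period=1.0, clock=time.monotonic):
        self.publish_fn = publish_fn
        self.period = period
        self.clock = clock
        self._last_sent = None

    def __call__(self, msg):
        now = self.clock()
        if self._last_sent is None or now - self._last_sent >= self.period:
            self.publish_fn(msg)
            self._last_sent = now
```

In a real node this would be registered as the subscriber callback on "robot_pose_estimate" with `publish_fn` bound to a publisher on "updates/robot_pose_estimate". Worth noting: ROS's stock `topic_tools` package already provides this (something like `rosrun topic_tools throttle messages robot_pose_estimate 1.0 updates/robot_pose_estimate`), so we may not need custom code at all.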

@partlygloudy partlygloudy self-assigned this Jan 17, 2019
@partlygloudy

We'll also want a second version of this in case we switch back to teleop control. In that case we'd need different sorts of information, such as images from the cameras, but could also potentially turn off some of the other information streams.
