technical: Directed at those in CS/Robotics/etc. General readers may still be able to keep up.
<< previous post <<
In the previous post, I went over my layered design approach for porting Geoff Nagy's Active Recruitment Framework to the ROS environment.
First Iteration Design
This figure outlines the black-box design of a single robot running on the arc platform.
Input
To the left of the diagram are the sensory inputs. Each gray box represents a ROS node that is specified by whoever is using this framework.
Odometry source: The position information of where the robot is. This is usually gathered from motion sensors that estimate the displacement of the robot based on wheel movement.
Laser source: Data from a laser scanner.
Sensor transforms: These are interesting. They let the robot determine its coordinate space relative to the environment it is in, and the location of its sensors. For example, when the laser reads in data, we need to know WHERE the laser is positioned at that time. When we check position information through the odometry node, what does this position represent? It's our POSITION, but relative to what? The sensor transforms bring the environment information into our local coordinate space, and will allow the robot to communicate its information to other robots easily. A rough sketch of how this looks in ROS follows.
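To make the transform idea concrete, here is a minimal sketch (not part of the actual framework) of using the tf library to convert a point seen by the laser into the robot's base frame. The frame names `base_link` and `laser_link` are assumptions for illustration and may differ on a real robot.

```python
#!/usr/bin/env python
# Minimal sketch: transform a point from the laser frame into the robot's
# base frame using tf. Frame names ("base_link", "laser_link") are assumed
# for illustration only.
import rospy
import tf
from geometry_msgs.msg import PointStamped

rospy.init_node('transform_example')
listener = tf.TransformListener()

# A point as seen by the laser, expressed in the laser's own frame.
pt = PointStamped()
pt.header.frame_id = 'laser_link'
pt.header.stamp = rospy.Time(0)  # "latest available transform"
pt.point.x = 1.0  # one metre in front of the laser

# Wait until the transform is available, then convert the point
# into the robot's base frame.
listener.waitForTransform('base_link', 'laser_link',
                          rospy.Time(0), rospy.Duration(4.0))
pt_in_base = listener.transformPoint('base_link', pt)
rospy.loginfo('Point in base_link frame: %s', pt_in_base.point)
```

The same mechanism works for sharing information between robots: once everything is expressed relative to a known frame, another agent can interpret it without knowing anything about this robot's sensor layout.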
Output
base_controller: This handles control of the robot. It receives velocity commands that tell the robot how to move.
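As a rough sketch of what those velocity commands look like (assuming the conventional `cmd_vel` topic and `geometry_msgs/Twist` messages, which this framework may or may not end up using), driving the base forward could be as simple as:

```python
#!/usr/bin/env python
# Sketch: publish velocity commands to the base controller.
# The "cmd_vel" topic name is the common ROS convention and is assumed here.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('drive_forward_example')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

rate = rospy.Rate(10)  # send commands at 10 Hz
cmd = Twist()
cmd.linear.x = 0.2   # move forward at 0.2 m/s
cmd.angular.z = 0.0  # no rotation

while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()
```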
Tasks: The robot will also output task signals that will be used to communicate with other agents in the environment. Likewise, the robot will need a way to receive signals.
I have not decided exactly what these signals will contain.
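Whatever the contents end up being, the mechanism would most likely be an ordinary ROS publisher/subscriber pair. Purely as a hypothetical sketch (the `/task_signals` topic name, the use of `std_msgs/String`, and the message contents are my own placeholders, not part of the framework):

```python
#!/usr/bin/env python
# Hypothetical sketch of how task signals could be exchanged between robots.
# The "/task_signals" topic and std_msgs/String message type are placeholders;
# the real framework may define its own message type.
import rospy
from std_msgs.msg import String

def on_task_signal(msg):
    # React to a task signal broadcast by another robot.
    rospy.loginfo('Received task signal: %s', msg.data)

rospy.init_node('task_signal_example')
pub = rospy.Publisher('/task_signals', String, queue_size=10)
sub = rospy.Subscriber('/task_signals', String, on_task_signal)

# Announce a (made-up) task to the other agents, then spin to receive theirs.
rospy.sleep(1.0)  # give the publisher time to connect
pub.publish(String(data='recruit:explore_area_3'))
rospy.spin()
```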
Lastly, there will be a global monitoring module to gather statistics across all active robots in the environment.
Basic Implementation
I wrote some integration tests using rostest. These ensure each topic is being published on within a required frequency range. For example, one test checks that odometry info is being published at 10 ± 0.05 Hz.
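As one possible sketch of such a check (not the actual test from my code), a rostest-style unit test could measure the odometry publish rate directly. The topic name `odom`, the package name `arc_tests`, and the tolerance are assumptions here; the tolerance is looser than the real ±0.05 Hz to suit the short measurement window.

```python
#!/usr/bin/env python
# Sketch of a rostest integration test checking that odometry is published
# at roughly 10 Hz. Topic name "odom" and package "arc_tests" are assumed.
import unittest
import rospy
import rostest
from nav_msgs.msg import Odometry

class TestOdomRate(unittest.TestCase):
    def test_odom_publish_rate(self):
        rospy.init_node('test_odom_rate')
        stamps = []
        sub = rospy.Subscriber('odom', Odometry,
                               lambda msg: stamps.append(rospy.Time.now()))
        rospy.sleep(5.0)  # collect messages for five seconds
        sub.unregister()

        self.assertGreater(len(stamps), 1, 'no odometry messages received')
        duration = (stamps[-1] - stamps[0]).to_sec()
        rate = (len(stamps) - 1) / duration
        # Tolerance loosened for this short window; the real test is tighter.
        self.assertAlmostEqual(rate, 10.0, delta=0.5)

if __name__ == '__main__':
    rostest.rosrun('arc_tests', 'test_odom_rate', TestOdomRate)
```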
Let's take a closer look inside
The inner modules include visualization, which will use rviz and Stage to give graphical insight into the framework.
The behavior module will manage planning, mapping, and other aspects of navigation.
The core "arc" component will be elaborated on soon, and will contain the main thesis work.
Up next >> Behaviour Module Design and Implementation >>
Hey Kyle,
I've been following your blog for the last few months (very interesting!), and I was wondering if I could borrow some wisdom from you.
I'm a second year Engineering student at the UofM who will be changing majors to the Statistics and CS program next fall (currently playing 'catch-up'), and I was hoping to perhaps join the AALab once I've attained enough background knowledge to be considered an asset rather than a liability to the lab.
If you've got time I was wondering if I could ask you some questions about what would be considered prerequisite knowledge to join the lab. My email is: 'minc33@gmail.com' if you're willing.
Keep up the good work!
-Josh
Hey there Josh. My apologies, I just saw this comment now (guess I need to pay more attention). I'd love to answer some questions, expect an email from me shortly.
Also, note to self: set up blog comment notifications, so I don't miss comments like this.
-Kyle