FOD BOT


The sensor testbed is built

It has begun.  I have started coding in RobotC for the first time ever tonight.  WOO HOO!!!

I would have started programming on Thursday, but a VEX Windows 7 64-bit driver issue took me a few days and a second computer to solve.

Regardless, I have started going through the RobotC tutorial videos. They have been extremely useful, and I am getting the ideas I need to create the basic sensor package and drive code branches.  I know I will need separate branches of code, so while I am going through these tutorials I am taking notes on which knowledge should go in each branch.  Here is a list of the branches I think I will need to end up with (a rough sketch of how they might fit together follows the list).

  • Initiation branch – start-up, systems check
  • Calibration branch – calibrate sensors, controls, battery, and other items
  • Communication branch – establishes and maintains communications with the docks
  • Self-locating branch – telemetry, SLAM, mapping, data fusion and checking
  • Database branch – create a database that the bot can store and access to perform a variety of functions
  • Log and error branch – track the many variables needed for error handling, improvements, maintenance, and black box safety items
  • Drive branch – all of the code to drive the bot
  • Sensor branch – all of the code to govern all of the sensors on the bot
  • Vision branch – all of the code to process vision sensor data
  • Mapping branch – all of the code to create, calibrate, and update the map of the space
  • Drive route branch – all of the code to consume the map and create a set of drive routes that take into account previous dirt levels, battery charge, drive times, and overlap
  • Vacuum branch – this controls the vacuum and agitators
  • Docking branch – this will be the location of the docking code and will contain things like orientation, drive sequence, communication, and sensor connections for use when docking
  • Maintenance branch – this is where we will store specific routines that will enable easy inspection and troubleshooting of mechanical systems, sensor hardware, and other physical components
  • Safety branch – this is where we will control obstacle avoidance, the lights, sounds, and other environmental safety hazard items
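
Since each of these branches will eventually live in its own file, here is a minimal RobotC sketch of how a few of them might hang together: initiation and calibration run once at start-up, the safety branch runs as its own background task, and the drive branches loop. Every name in it is a placeholder I made up, not a final design.

    // Placeholder stubs; each would eventually move to its own branch file.
    void startUpChecks()    { /* Initiation branch: battery, motor, sensor checks */ }
    void calibrateSensors() { /* Calibration branch: zero gyros, encoders, etc.   */ }
    void driveNextRoute()   { /* Drive and drive route branches: run one pass     */ }

    task obstacleWatch()    // Safety branch: runs alongside the drive code
    {
      while (true)
      {
        // read range sensors here and stop or reroute the bot if needed
        wait1Msec(50);
      }
    }

    task main()
    {
      startUpChecks();
      calibrateSensors();

      startTask(obstacleWatch);   // safety checks keep running in the background

      while (true)
      {
        driveNextRoute();
        wait1Msec(25);
      }
    }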

This is just the list I have been able to come up with so far.  I am still at the driving-straight-with-encoders level of code, so I have a long way to go (a first pass at that exercise is sketched below).  But after managing software development at the CATT Lab, I am all too familiar with keeping code organized, keeping files under 300 lines, documenting code well, and maintaining discipline for version management.
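
For reference, here is roughly what that exercise looks like so far. The motor and encoder names would come from the Motors and Sensors Setup, and the tick target and correction gain are numbers I will have to tune, so treat this as a sketch rather than finished code.

    task main()
    {
      SensorValue[leftEnc]  = 0;               // zero both quad encoders before moving
      SensorValue[rightEnc] = 0;

      while (SensorValue[leftEnc] < 1000)      // drive roughly 1000 encoder ticks
      {
        int error = SensorValue[leftEnc] - SensorValue[rightEnc];

        motor[leftMotor]  = 60 - (2 * error);  // slow whichever side is ahead
        motor[rightMotor] = 60 + (2 * error);

        wait1Msec(20);
      }

      motor[leftMotor]  = 0;                   // stop at the end of the run
      motor[rightMotor] = 0;
    }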

 

Parts are on their way

FOD BOT I project update.

I am going to use VEX parts for the first example since they are easily assembled and there is a big developer community.  I am also going to work with the Arduino chipset to see how the various VEX sensors work with each platform.  Since I am a new programmer, I am looking to do mostly learning in the first few months.  However, I also asked a few of the Walt Whitman High School Robotics kids (FRC 1389) if they were interested in helping, and six have responded so far.  If they are willing to write the code, then the project may accelerate much more quickly than I am currently expecting.

 

Here is a rough plan for the first part of the project

  1. Assemble VEX chassis
  2. Create code to control drive train
  3. Test drive train code
  4. Integrate, code, and test VEX sensors one at a time
    1. Encoders
    2. Pressure switches
    3. Limit switches
    4. Ultrasonic range finders
    5. IR range finders
      1. VEX currently does not have IR sensors, so I am using some of the small Sharp IR sensors (a first reading sketch follows this list)
        1. GP2Y0D810Z0F Digital Distance Sensor 10cm
        2. GP2Y0D805Z0F Digital Distance Sensor 5cm
        3. GP2Y0A41SK0F Analog Distance Sensor 4-30cm
        4. GP2Y0A21YK0F Analog Distance Sensor 10-80cm
        5. GP2Y0A02YK0F Analog Distance Sensor 20-150cm
    6. Accelerometers
    7. Single axis gyro
    8. 3 axis gyro
      1. VEX does not have a 3-axis gyro, so I will use an Arduino 3-axis gyro chip instead; once I get there I will update the plan
    9. GPS
      1. VEX does not have GPS, so I will use an Arduino GPS chip instead; once I get there I will update the plan
  5. Integrate, code, and test human awareness elements
    1. Lights
    2. Speakers
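
To give a flavor of step 4.5, here is a rough RobotC sketch for reading one of the analog Sharp sensors (the GP2Y0A21YK0F, 10-80 cm) on a VEX analog port. The sensor name, the 0-4095 reading range, the 5 V scaling, and the simple 27/volts distance curve are all assumptions I will need to check against the Cortex and the Sharp datasheet.

    float irDistanceCm()
    {
      int raw = SensorValue[irFront];          // raw analog reading, assumed 0-4095
      float volts = raw * 5.0 / 4095.0;        // convert to an approximate voltage

      if (volts < 0.4)
        return 80.0;                           // below ~0.4 V the target is out of range
      return 27.0 / volts;                     // very rough inverse fit for 10-80 cm
    }

    task main()
    {
      while (true)
      {
        if (irDistanceCm() < 20.0)             // something close in front of the bot
        {
          motor[leftMotor]  = 0;               // simplest possible reaction: stop
          motor[rightMotor] = 0;
        }
        wait1Msec(50);
      }
    }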

 

Some of the questions I am working on answering are about self-locating.  I know this is putting the cart before the horse a bit, but it will be the next phase of the project, so I am trying to get ready for it.  SLAM, GPS, WiFi, or some other form of self-locating relative to the base and the environment the bot finds itself in are some of the ideas I am looking at; a first dead-reckoning sketch is below.  I will keep you updated as I get more info.
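
As a way of getting ready, here is a small dead-reckoning sketch that turns encoder ticks into an (x, y, heading) estimate that the SLAM and mapping branches could later correct against other sensors. The centimeters-per-tick and track-width numbers are placeholders for whatever the finished chassis measures.

    float poseX = 0.0, poseY = 0.0, poseTheta = 0.0;   // position in cm, heading in radians

    void updatePose(int leftTicks, int rightTicks)
    {
      float cmPerTick = 0.03;   // assumed wheel circumference divided by encoder counts
      float trackCm   = 28.0;   // assumed distance between the drive wheels

      float dLeft  = leftTicks  * cmPerTick;
      float dRight = rightTicks * cmPerTick;

      float dCenter = (dLeft + dRight) / 2.0;          // how far the bot moved
      float dTheta  = (dRight - dLeft) / trackCm;      // how much it turned

      poseX     += dCenter * cos(poseTheta + dTheta / 2.0);
      poseY     += dCenter * sin(poseTheta + dTheta / 2.0);
      poseTheta += dTheta;
    }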

FOD BOT 1 project is a go

I am embarking on a new journey that I hope will lead to some fun, some money, and a lot of learning.  I will be building and programming a robot.  The building part is not that hard; I have done that several times with FIRST teams in Philly and Washington, DC.  It is the programming part that is going to be the challenge for me.

 

I want to build a robot named the FOD BOT I.  The FOD BOT 1 is an OSHA-approved Roomba on steroids, capable of operating autonomously 24 hours a day to prevent Foreign Object Debris (FOD) in a production environment.  It is an idea I came up with a few years ago that I think still has legs.  If you want to learn more about this project, check out this video.

I will add the Kickstarter info once it is approved.

I am also working on combining my work at the CATT Lab with my love for robots.  The CATT Lab is working to be the NOAA of transit: it collects transportation data from the entire country on a daily basis and serves it back to customers in 26 states, both in real time and through access to the archive.  One of the challenges facing the lab is the growth of connected cars (cars that talk to each other) and autonomous cars (cars that drive themselves).  Both of these types of vehicles will change the way we drive and how transportation data is collected.  Connected cars hit the roads as early as next year, and autonomous cars might hit the road as early as 2020.

But if you look at cars as robotic systems, they have a lot of similarity to the changes taking place in aviation right now as the FAA works to create the legal framework for Unmanned Aerial Systems (UAS).  These are essentially autonomous cars that can fly; the problem is they are already in the airspace, being used by farmers, wedding photographers, real estate agents, news organizations, and many others.  This is forcing the FAA to react rapidly, leaving lots of room for recommendations.  That is where I hope to fit in: I am going to write a series of white papers on capturing black box data from unmanned systems, as it applies to both ground- and air-based applications, in hopes of guiding the US Department of Transportation (USDOT) toward a universal policy.  I will update you as it goes.