FOD BOT I project update.
I am going to use VEX parts for the first prototype since they are easily assembled and there is a big developer community. I am also going to work with the Arduino platform to see how the various VEX sensors work with it. Since I am a new programmer, I expect to spend the first few months mostly learning. However, I also asked a few of the Walt Whitman High School robotics kids (FRC 1389) if they were interested in helping, and six have responded so far. If they are willing to write the code, then the project may accelerate much more quickly than I am currently expecting.
Here is a rough plan for the first part of the project:
- Assemble vex chassis
- Create code to control drive train
- Test drive train code
- Integrate, code and test vex sensors one at a time
- Pressure switches
- Limit switches
- Ultrasonic rangefinders
- IR range finders
- VEX currently does not have IR sensors, so I am using some of the small Sharp IR sensors:
- GP2Y0D810Z0F Digital Distance Sensor 10cm
- GP2Y0D805Z0F Digital Distance Sensor 5cm
- GP2Y0A41SK0F Analog Distance Sensor 4-30cm
- GP2Y0A21YK0F Analog Distance Sensor 10-80cm
- GP2Y0A02YK0F Analog Distance Sensor 20-150cm
- Single axis gyro
- 3 axis gyro
- Integrate, code, and test human-awareness elements
Some of the questions I am working on answering are about self-locating. I know this is putting the cart before the horse a bit, but it will be the next phase of the project, so I am trying to get ready for it. SLAM, GPS, Wi-Fi, or some other form of self-locating relative to the base and to whatever environment the robot finds itself in are some of the ideas I am looking at. I will keep you updated as I get more info.
I am embarking on a new journey that I hope will lead to some fun, some money, and a lot of learning. I will be building and programming a robot. The building part is not that hard; I have done that several times with FIRST teams in Philly and Washington, DC. It is the programming part that is going to be the challenge for me.
I want to build a robot named the FOD BOT I. The FOD BOT I is an OSHA-approved Roomba on steroids, capable of operating autonomously 24 hours a day to prevent Foreign Object Debris (FOD) in production environments. It is an idea I came up with a few years ago that I think still has legs. If you want to learn more about this project, check out this video.
I will add the Kickstarter info once it is approved.
I am also working on combining my work at the CATT Lab with my love for robots. The CATT Lab is working to be the NOAA of transit: it collects transportation data from the entire country on a daily basis and serves it back to customers in 26 states, both in real time and through access to the archive. One of the challenges facing the lab is the growth of connected cars (cars that talk to each other) and autonomous cars (cars that drive themselves). Both of these types of vehicles will change the way we drive and how transportation data is collected. Connected cars could hit the roads as early as next year, and autonomous cars might arrive as early as 2020. If you look at cars as robotic systems, they have a lot in common with the changes taking place in aviation right now, as the FAA works to create a legal framework for Unmanned Aerial Systems (UAS). UAS are essentially autonomous cars that can fly; the problem is that they are already in the airspace, being used by farmers, wedding photographers, real estate agents, news organizations, and many others. This is forcing the FAA to react rapidly, leaving lots of room for outside recommendations. That is where I hope to fit in: I am going to write a series of white papers on capturing black-box data from unmanned systems, as it applies to both ground- and air-based applications, in hopes of guiding the US Department of Transportation (USDOT) toward a universal policy. I will update you as it goes.
I read an article last week on quantum robots. The article was a bit confusing, so here is my own research on the topic. I hope it helps you understand how the author reaches his conclusion that quantum robots are faster, more accurate, and better at multitasking than standard robots.
Definition of a robot:
- A machine capable of carrying out a complex series of actions automatically.
- (esp. in science fiction) A machine resembling a human being and able to replicate certain human movements and functions.
All robots use computers to consume information about their environment and act upon those inputs according to the code they are loaded with.
Definition of a computer:
- An electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program.
- A person who makes calculations, esp. with a calculating machine
Definition of computer code:
- the symbolic arrangement of data or instructions in a computer program or the set of such instructions.
The robot’s code is usually written for the capabilities of the CPU (central processing unit), meaning that the code only asks the hardware for answers so many times per second. The hardware speed, plus the complexity of the math in the code, determines the reaction speed of the robot and the number of things it can do at any one time. That last sentence is a bit misleading: it assumes that power is not a limitation, that is, that the robot has more than enough electricity to let the CPU, the sensors, and the mechanisms all operate simultaneously at maximum speed.
As robots are built to do more things at once, each of increasing complexity, the computer controlling the robot starts to become the limiting component. In response, roboticists are adding more computing capability to their robots. That added capability requires more volume and weight for the computer, more structure to hold it, and more power to sustain maximum computing capability.
With that understanding you can now ask the question: if robots could increase their computational capability without additional weight or power demands, could robots become more capable in the future? The answer is yes, and that is where quantum computing comes in.
Definition of quantum computer:
- A computer that makes use of the quantum states of subatomic particles to store information.
Quantum computers are pretty cool, and Lockheed Martin just announced that they have the first quantum computer ready for testing. If they get this computer working, they will change the world. Quantum computing has the potential to render all of the world’s current cybersecurity useless. This article does a great job of describing the movie Sneakers… I mean, the future, if quantum computing is real.
Going into what a quantum computer actually is, in my opinion, is just a waste of time when it comes to understanding what a quantum robot is. The end result is that quantum computers are just computers that can handle significantly more calculations for the same weight and power consumption as a comparably sized conventional computer. That means my robot could get a significant computer and code upgrade without any impact on the robot’s systems or structure. Not as cool as what I thought quantum robots were before I started reading the article, but still exciting nonetheless. Let’s get the quantum computer working first and then worry about quantum robotics later.
I really like the Roomba. It is the first real robot to be accepted as part of daily life. The folks behind the Roomba are trying to make a similar stride forward in the workplace with their next creation, Baxter. Baxter is a semi-fixed robot that can easily be compared to legacy production robots, with one key difference: in addition to being programmable in a traditional computer language, it can also be programmed by recording actions. This is most comparable to the two ways you can create a macro in Excel: you can write the macro yourself, or you can record your button clicks and have Excel write the code for you. Baxter’s record-your-actions programming is the feature they are touting as the game changer.
In some ways I can see how this will make the human-robot interface more open to people who are not coders or roboticists. The price point also makes this type of automation accessible to a wider array of companies. But this is still a fixed robot that will do a single task somewhere in a company’s value stream. I understand that it can be repurposed more easily than other, more traditional assembly robots, but if a company is weighing an employee against a robot, I am not sure Baxter changes the current business-case assumptions.
In the end, I think this is the next step forward for an industry that by its very nature is shifting jobs from low-skilled to high-skilled, and along the way reducing the total number of people needed to accomplish any given task.