iPad Focus Group Reflections

 

Overall, I loved the iPad and I am so glad to have been given the opportunity to try it out FOR FREE. It has shown me that I need one and that it is essential to school life. The iPad has helped me stay organised because all my files are in one place, and it has kept my back straighter because I am no longer weighed down by all the notebooks I have for each class. There are some downsides to the iPad, like how I can’t print from it or use programs that require Flash, but the positives outweigh those little negatives. My feedback for the IPFG is all positive. I loved the program, and if it runs again next year, I would be enthralled and would definitely sign up for it.

Jan 9, 2017: Programming the Ultrasonic Sensor (S)

  1. How does an ultrasonic sensor work?
    The ultrasonic sensor sends out a high-frequency sound pulse and then waits for the sound to be echoed back. The time it takes for the echo to return is used to work out how far away the object is (roughly, distance = speed of sound × echo time ÷ 2, since the sound has to travel to the object and back). This distance is shown in the debugger window in RobotC. The input and output come from the face of the ultrasonic sensor: on the front there are two circles, one of which is a speaker and one of which is a microphone. The speaker sends out the sound pulses, and the microphone receives the echo so the sensor can calculate how far away the object is. Every so often, the ultrasonic sensor outputs a sound and gets back some input information.
  2. How do you use the ultrasonic sensor in RobotC and the cortex micro-controller?
    In RobotC, there are a few sample programs that relate to the use of the ultrasonic sensor. Since this is the first time I’ve ever used one, I took a look at the sample program.
    Trying to understand
    The sample program begins with a “wait” command and then a while (1==1) loop. The 1==1 part just means the condition is always true, because one always equals one, so the loop repeats everything inside its brackets again and again. Inside it there is another while loop, whose condition is that the value of the ultrasonic sensor is above 10 inches; while that is true, it keeps doing what is inside the loop. Inside that loop in the sample program, the motors move in opposite directions, making the robot turn in circles while it scans for values. That’s what I got out of reading the sample program for the first time. Everything seemed to make sense. Then I tested it out on the robot, and it moved about a centimeter to the right and stopped. I was extremely confused, so I spent about one class trying to figure out what was wrong with my program. I didn’t manage to figure it out. During the next class, Mr. Lin showed us what the actual program was supposed to look like, and I reprogrammed my own version of what I learned.
    Understood and self-programmed
    In this program, the robot will go forward until the sensor sees an object 20 inches or less away, then it will turn for 2 seconds, then restart the loop (a rough sketch of what this kind of program looks like is included after this list). This was my first time using a while loop, and I decided to experiment a little more on my own. So, on Google Script, I coded something on my own at home.
    Trying things out
    In this code, I’m using a form of programming that is new to me, called “if/else statements”. In this code, the robot waits for one second, then starts the loop. The robot will go forward until it spots something less than 20 inches away. If it does, it moves on to the else statement, where it turns right until there is a route with no object within 20 inches of its path (see the second sketch after this list). I’m waiting to test this new code out in the future.
  3. In this unit, I learned how to use while statements and if/else statements. I thought these were really helpful and I will most definitely be using them in the future.
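
Roughly, the “go forward until something is 20 inches away, then turn” idea from question 2 looks something like this in RobotC (this is just a sketch, not my exact code; the sensor name “sonarSensor”, its port, and the motor ports are placeholders that come from the Motors and Sensors Setup):

#pragma config(Sensor, dgtl8, sonarSensor, sensorSONAR_inch)  // placeholder port for the ultrasonic sensor

task main()
{
  while (1 == 1)                          // 1 always equals 1, so this outer loop never ends
  {
    // drive forward while the nearest object is more than 20 inches away
    while (SensorValue[sonarSensor] > 20)
    {
      motor[port2] = 63;                  // right drive motor (placeholder port)
      motor[port3] = 63;                  // left drive motor (placeholder port)
    }

    // something is within 20 inches: turn for 2 seconds, then the outer loop restarts
    motor[port2] = 63;
    motor[port3] = -63;
    wait1Msec(2000);
  }
}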
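
And roughly, the if/else idea looks something like this (again a sketch, not my exact code; the ports and sensor name are placeholders, and which way the motors spin to “turn right” depends on how the robot is built):

#pragma config(Sensor, dgtl8, sonarSensor, sensorSONAR_inch)  // placeholder port for the ultrasonic sensor

task main()
{
  wait1Msec(1000);                        // wait for one second before starting the loop

  while (1 == 1)                          // endless loop
  {
    if (SensorValue[sonarSensor] > 20)    // path ahead is clear: go forward
    {
      motor[port2] = 63;
      motor[port3] = 63;
    }
    else                                  // something is within 20 inches: turn right instead
    {
      motor[port2] = -63;
      motor[port3] = 63;
    }
    // the sensor is checked again every time the loop repeats, so the robot keeps
    // turning until there is no object within 20 inches of its path
  }
}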

Robotics Dance Challenge Reflection (S) December 2

In robotics, we were assigned a project: to teach a robot how to dance. Through this project, we learned more about the different types of code and functions that go into making a robot do something. We used the cycle of design thinking to create this robot’s dance. We started off by brainstorming ideas for which song we wanted to use and did the initial thinking for the dance we wanted the robot to do. This part gave us room for creativity, as we got to explore and investigate which songs we could use. After choosing the song, I started to program. To begin this process, I first brainstormed and did some initial idea collecting to think more deeply about how exactly I wanted to program my robot to dance. Throughout the project, I faced many challenges, which I will discuss later in the blog post. After programming the robot for the first time, I tested the robot in the virtual world to see if the program worked. When it didn’t, this brought me back to the brainstorming stage of design thinking, and I had to try to overcome the problem I faced. This cycle of testing and remodelling continued throughout the project until my robot worked and danced the way I wanted it to.

The entire process took around 3 weeks to do. 3 weeks to program a dance that is under two minutes long.

I spent the first class picking music and every class after that programming. I thought I wouldn’t have enough time, so I brought my PC home twice. Only one of those two times did I actually work on the program. Nevertheless, I finished it on the due date and I’m very proud of what I have produced.

November 9th reflection:
Today in class, we continued to work on our robotics dance program. I edited the song on iMovie at home in advance so I could start programming today.
I managed to program 20 seconds’ worth of robot movements out of the desired dance, which is about 1 minute 40 seconds long.
Everything’s going great. I’m glad that I took the extra step of editing the music at home, because it gave me more time to do what I can only do in class: program on the school’s PCs.
Here is the audio that I edited at home. I just slowed down a few parts and cut other parts out to make it easier for the robot:

 

November 30th reflection:
Today, I finished programming the entire dance and tried it out on the physical robot. Good news: the physical robot works. Bad news: the arm motor is inverted between the physical robot and the virtual world. I still need to change the program by inverting that motor, but it won’t take a lot of time. I should be ready for December 2, when the project is due.

I did encounter a few challenges, though.
The first one was picking music. I found it very hard to find a song with a clear beat, but I ended up finding one with the help of a friend. The song I used was “Power” by will.i.am ft. Justin Bieber.
Next, programming was very confusing because I had to match it up to the song that I picked. I ended up editing the song in iMovie, slowing down several parts and breaking up the music so it would be easier to program. Programming in general was kind of complicated because I wanted to program something that worked, but also something complex. Complex in the sense that the program contained things we never learned, like multiple source files, but also in what the robot did: the robot only has a few simple moves, so I had to make things more interesting by doing things like multitasking.
The third problem was timing. It was very hard to get the robot’s moves to sync with the music. So, in iMovie, I went back to the music file that I had already edited and began to split areas of the song so that it would show me the timing (minutes and seconds). This made it a lot easier to see when the beat hit or when I had to program a new move into my program. In the program, I left comments beside several lines so I could see where in the song I was and only edit that part.
Finally, when testing out the program, my arm motor was inverted in the virtual world. This was extremely annoying because I had specifically programmed the arm motor to do something that required it to go in that direction. Fixing the problem was simple, though. I didn’t want to take the robot apart because I didn’t want to risk breaking it, like many other groups did (which resulted in them using our robot), so I simply changed the motor direction in the program.
I mainly fixed the issues listed above by Google searching, using sample programs, and asking classmates or the teacher, Mr. Lin.

In the end, this was my program. I like it because it is short and simple. To edit this program fully, I had to go into the tabs at the top of the program, for example “main moves”. There, I would be able to edit any part of the program that I grouped into that tab, and the same goes for the other tabs. This is called “multiple source files”. To have multiple source files, create a couple of files of code, then save them into one folder. Each file holds a few lines of code, and the folder should contain one file for the main program. In the main program, whenever you need something from another file in that folder to be the next few lines of code, type ‘#include’ followed by ‘”(name of file).c”‘. In my case, one of my files was named “Preprogrammed Codes”, and “.c” is the format the file is in.
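
As a rough sketch of how the multiple source files fit together (the helper move “armWave” and its port are made up for illustration; only the “Preprogrammed Codes.c” file name is from my actual project):

// Preprogrammed Codes.c  (a helper file saved in the same folder as the main program)
void armWave()                     // made-up example of one pre-written move
{
  motor[port6] = 63;               // placeholder arm motor port: arm up
  wait1Msec(500);
  motor[port6] = -63;              // arm back down
  wait1Msec(500);
  motor[port6] = 0;
}

// Main program file (in the same folder)
#include "Preprogrammed Codes.c"   // pulls the helper file's code into the main program

task main()
{
  armWave();                       // call the pre-written move as if it were typed here
  // ...the rest of the dance goes here...
}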

IMG_6978

However, there are other ways to organise the program without using multiple source files. The original format I used was basically programming everything in one program, resulting in 200-something lines of code, which is very hard to sort through once it gets over 100 lines.
Here is the code:
https://drive.google.com/drive/folders/0B6z4PNCjdoh_c2VhVXBWaG1YMlU?usp=sharing

And here’s the video for the final project:
https://photos.google.com/share/AF1QipP-0FigxEn0xoP6uDtDavXlrmkcASA8nuA8bnbVin6XexU9zSsKpY4ktwG0MkuElw/photo/AF1QipPU7FlZ7KgUwbjVhKFbvclkTS1RbNHK-U3cdVtA?key=VWJJLUp6eHQ3dVR1c2otQWUwckt5N3JBOXVYSzR3

Overall, I would rate my project as a success because not only did I manage to program the robot to dance, which was the basic requirement, but I also learned a lot of new techniques in my coding that I look forward to using in the future.
But I could improve on making the dance more entertaining. I noticed that a few students had added extra parts to their robot or stuck on sheets of paper. Even though doing that doesn’t take very long, it does amuse the audience, so I should look into adding something creative to my robot that is unique and different from everyone else’s, distinguishing my project from the others.

I really enjoyed this project, so I encourage it to continue in future years. The project not only taught me more about how the design thinking process works, but also how it applies specifically to robotics. It was also interesting to see how everyone’s robot danced in a different way, which shows that even in robotics there is so much creativity involved in the process, kind of like a form of art (even though it’s not a fine arts credit, which I highly encourage it to be counted as).

Here are some pictures of my robot:
IMG_7095 IMG_7096

Robotics Dance Reflection Post November 30th

Today, I finished programming the entire dance and tried it out on the physical robot.
Good news: the physical robot works. Bad news: the arm motor is inverted between the physical robot and the virtual world. I still need to change the program by inverting that motor, but it won’t take a lot of time. I should be ready for December 2, when the project is due.

Remote Control Program (F)

In unit 4, we are learning to use remote controls. For this task, we must write a program in RobotC that lets the robot’s movements be controlled by a remote and that can trigger a function to perform a set of pre-written actions.
Here is the program to control the robot using a remote control.
Screen Shot 2016-11-16 at 9.37.34 PM

The while (1==1) part is called a while loop. As long as the condition inside the parentheses is true, everything inside the loop keeps happening; since 1 always equals 1, the loop runs forever. The commands inside the while loop assign different functions to different buttons. Channel (Ch) is the name of the joysticks, and Btn is short for button, which, in this program, is used to control the arm motor. The challenge for this was just trying to understand the program as a whole, especially the arm control part, as it has many lines of code just to move a single motor back and forth.
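
In rough code form, that kind of remote control program looks something like this (a sketch, not my exact program; the motor ports, and which channels and buttons control which motors, are placeholders):

task main()
{
  while (1 == 1)                      // keep listening to the remote forever
  {
    // driving: each side of the robot follows one joystick channel
    motor[port3] = vexRT[Ch3];        // left joystick (Ch3) -> left drive motor (placeholder port)
    motor[port2] = vexRT[Ch2];        // right joystick (Ch2) -> right drive motor (placeholder port)

    // arm control on a pair of buttons: up, down, or stop
    if (vexRT[Btn6U] == 1)
      motor[port6] = 63;              // raise the arm
    else if (vexRT[Btn6D] == 1)
      motor[port6] = -63;             // lower the arm
    else
      motor[port6] = 0;               // no button pressed, so the arm motor stops
  }
}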

The second program is used to trigger a prewritten function:
Screen Shot 2016-11-16 at 9.37.20 PM
“Void” and everything following it in the brackets is not performed unless called upon. The function’s name comes directly after the word “void”; in this case, it is called walkForward. In the lines after that, what the function is supposed to do is pre-programmed. This pre-programmed action makes the robot wait for two seconds and then go forward at power 127. Inside the parentheses after “walkForward” on the very top line, it says “int time” because it lets you decide the time in task main. Now, to activate this function, go down to where task main is and simply write in the function name, walkForward, and, in parentheses, the time the function runs for.
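
In rough code form, the idea looks something like this (a sketch, not the exact program from the screenshot; the motor ports are placeholders, and stopping the motors at the end is my own addition):

void walkForward(int time)            // "time" gets decided later, when the function is called
{
  wait1Msec(2000);                    // wait for two seconds
  motor[port2] = 127;                 // then go forward at power 127
  motor[port3] = 127;
  wait1Msec(time);                    // keep driving for however long the caller asked for
  motor[port2] = 0;                   // stopping afterwards is an extra step, not part of the original
  motor[port3] = 0;
}

task main()
{
  walkForward(3000);                  // activate the function; here it runs for 3000 milliseconds
}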

 

Robotics reflection post Nov 9

Today in class, we continued to work on our robotics dance program. I edited the song on iMovie at home in advance so I could start programming today.
I managed to program 20 seconds’ worth of robot movements out of the desired dance, which is about 1 minute 40 seconds long.
Everything’s going great. I’m glad that I took the extra step of editing the music at home, because it gave me more time to do what I can only do in class: program on the school’s PCs.

Remote Control Questions

1. What is remote control?
Control of a machine or apparatus from a distance by means of radio or infrared signals transmitted from a device.
However, this control doesn’t have to be wireless; it only requires indirect contact with an object to make it do something. As long as you’re not directly touching the object to make it do what you want, it technically counts as “remote control”.

2. Why use remote control?
We use remote controls because they allow us not only to make corrections on the fly, but also to reach distances that might not otherwise be reachable by a person/pilot.
We have also established the fact that humans cannot do everything. There are many types of scenarios or climates that humans are unable to survive in, and therefore we use remote control to get something else to do the job for us.

3. Different types of remote control?
Air-conditioner remote, TV remote, Xbox/PS controller. These are just the controls perceived by people to be remote controls. By the definition of indirect contact to control an object, other types of controls could be the mouse connected to your computer, the steering wheel to your car, your light switch, etc.

4. Spaceship remote control challenge?
Because of the great distance between ‘ground control’ and the spaceship, transmissions between them take a lot of time, so feedback and corrections for things that aren’t visible from the spaceship aren’t immediate. For example, if the spaceship breaks, you don’t know until around 7 minutes later, if we’re talking about waves travelling at the speed of light. But we don’t use humans because the cost to get them (us) up to space is very high, even higher than using remote control. Humans need food, warmth, and sleep, and they are irreplaceable (as individuals).

5. Remote control improvement:
Create a universal remote that can be used for multiple applications outside of their own ecosystems, such as combining an air-con remote with a TV remote. This creates centralisation and something that is much easier to organise. Apple is creating something like this called HomeKit, but the appliances needed to use it are very pricey. So, create a CHEAP universal remote that works with everything. This way we can be more organised and only have to care about one remote instead of fumbling around for one out of the hundred remotes that we own.

6. VEX remote control. How does it work?
There are a couple of ways that the remote control interacts with the robot. One way is through a hardline connection between the controller and the robot’s brain (cortex) through different ports; basically, wires connecting the robot to the controller itself. The other way is through a USB port: there is a port for a USB key at the bottom of the controller, and if you stick in a USB key with the pre-programmed buttons on it, you can control the robot wirelessly through VEX remote control. Each button, and each shift of the joysticks, has a command programmed to it that determines the robot’s movement. For example, I could program joystick 1 so that if joystick 1 is moved up, the robot’s arm moves up. Then, whenever you move joystick 1 forward (or up, as we programmed it to be), the robot’s arm will elevate (move up).

October 25 Robotics Reflection

Today, we refreshed and reviewed our knowledge of shaft encoders and took a formative test on the robot and its parts.
The revision was really helpful, and the formative test was kind of confusing, as there were things like two different screws.
Then we were given some time to work on the six-step challenge that was supposed to be due last Friday but, due to the typhoon, was postponed.

Robotics Unit 2 Reflection

1. What was the challenge about?
We, in groups, had to program a robot to follow a track using ROBOTC. The challenge was basically about making us apply knowledge that we had already learnt and use it to create something that we could show; a task (challenge) to prove that we understood what we were taught.

2. What did you know before working on the challenge.
We worked a bit with ROBOTC before this task. We knew how to make the Squarebot go straight and turn, which is all the knowledge we needed. However, if students wanted consistent data, they had the option of using shaft encoders. We were taught to use shaft encoders by a website and a bunch of videos. Most of the things we learned before working on the challenge were self-taught.

3. What were the technical difficulties for completing the challenge? List them out and explain why they were difficult.
The unbalanced power between the motors and the loss of power when the battery drains out.
The two motors that are connected to the wheels of the robot may run at different levels of power. One of the motors might spin faster than the other even when both are given the same command. This makes the robot turn slightly as it moves forward, so instead of getting the straight line needed to finish the challenge, we got a small curve over time. This is extremely annoying if you want the robot to finish the track as smoothly and accurately as possible.
The second thing is the battery connected to the robot. As the battery slowly drains, the robot’s motors get less powerful, which means the robot travels a shorter distance with the same program, and we had to compensate for that by changing the revolutions of the wheel.

4. Describe the approach(es) (e.g. time vs shaft encoder) you thought of during the process.
While some people used shaft encoders, we used time. I had only very basic knowledge of shaft encoders, and I knew they were more consistent, but I chose to use time because it was something I could control for the robot, and other people were struggling with the encoders. The thing about time is, as I mentioned before, it is less consistent, meaning the data changes every time you try it. But we used time to get a good estimate of how long the robot’s motors should run for. To find out how long the motors should run for when going in a straight line, we made the motors run for a few seconds (I think it was two seconds) and then measured how far the robot travelled in that time. Then we found the unit rate (how far the robot travelled in one second) and used it to work out the time for every length of the track, which gave a good estimate for the program. It worked really well for straight lines because lengths are consistent, unlike turning and the rotation of the robot. Finding the correct amount of time the motor had to run for to make the robot turn the correct angle was a little harder, because the robot gave different results when turning. To find an estimate for the time, we found the correct amount of time for one known turn, then scaled it by how far the robot actually had to turn, which is the angle measured from 180 degrees. For example, if the angle on the track was 75 degrees, the robot had to turn 105 degrees, because that is 180 minus 75. This only gave an estimate; to find the correct line-travel and turning times, we used trial and error starting from those estimates. (A quick worked example with made-up numbers is below.)
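
As a quick worked example (the numbers here are made up, not our actual measurements): if the robot travels 40 cm in the 2-second test, the unit rate is 40 ÷ 2 = 20 cm per second, so a 60 cm stretch of track needs 60 ÷ 20 = 3 seconds of driving. For a turn, if the track bends at 75 degrees, the robot has to rotate 180 - 75 = 105 degrees; and if a 90-degree turn is known to take 0.8 seconds, the 105-degree turn needs roughly 0.8 × 105 ÷ 90 ≈ 0.93 seconds, which then gets fine-tuned by trial and error.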

5. Did you use the virtual world to test your program? Was it useful?
No. The virtual world is perfect; it’s virtual. When I try out a program in the virtual world and it’s perfect, it gives me too much hope. Then, when I try it out in reality, the robot goes everywhere and the hope is lost. The virtual world only gives an estimate, like the approach from question 4, but if you want it to really work, you have to try it out in real life. It’s useful if there are too many other robots on the track and you desperately need to test out your program.

6. What have you learnt in the process? Any new insight?
Talk to your partner, it really helps. My partner was the one who suggested unit rates when it came to straight lines. This helped me complete the rest of my program. Also, it doesn’t hurt to spend a little extra time on something you care about/ are interested in. I spent an extra 50 minutes working on my program on one of the community meeting mornings, and it really paid off as my program ran pretty accurately.

September 13 Robotics Reflection

Today, we learned how to track the distance a robot travels using two sensors called shaft encoders. These small red blocks mount on the same axle that the shaft, or the gear, goes through to turn the wheel. What the shaft encoders do is count the number of rotations made by the robot’s wheels over a distance.
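As a rough sketch of the idea (the sensor name “leftEncoder” and the ports are placeholders from the Motors and Sensors Setup; a VEX shaft encoder counts 360 ticks per full rotation of the axle):

#pragma config(Sensor, dgtl1, leftEncoder, sensorQuadEncoder)  // placeholder port for one encoder

task main()
{
  SensorValue[leftEncoder] = 0;            // reset the count before moving

  // drive forward until the encoder has counted one full wheel rotation (360 ticks)
  while (SensorValue[leftEncoder] < 360)
  {
    motor[port2] = 63;
    motor[port3] = 63;
  }

  motor[port2] = 0;                        // target reached, so stop both motors
  motor[port3] = 0;
}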
We were introduced to a website: http://www.education.rec.ri.cmu.edu/products/cortex_video_trainer/lesson/index_movement.html, that taught us how to code a short program to use these sensors. By going to “open sample programs” we used a precoded program and watched a few videos that taught us how to program it by ourselves.
Here is a photo of the program of the shaft encoders:
Screen Shot 2016-09-13 at 7.46.02 PM

I also fixed my program from last time using trial and error:
Screen Shot 2016-09-13 at 7.46.11 PM