Saturday

Final Project, V: Putting It All Together

       It was two days before the exhibit when we started moving very fast. We had a head, a jaw attached to it, and eyes and eyelids on it. We also had the code for everything except the jaw. We had craft materials, such as yarn, paint and glue, thanks to Lyn, who took Juliette and me to a crafts store on Marathon Monday. We also had the sound files re-recorded in Marie's voice, because... well, it was just more effective when she said it. So here is how everything came together.
       The week before, Chris had visited and showed us how to make the NXT play a sound file. This was great, except the NXT was not really audible even at top volume. So I asked Lyn if he could get us a microphone and speaker set for amplifying the sound. And he did! That was one problem solved. After about three hours of filing with various instruments, I got the jaw to fit into the skull. Only then was I able to test my code for playing the sound files. I soon realized that the software wasn't going to work until I fixed the mechanical issues. I had to play around with the gear ratio and decided on 1 to 8 (there is a quick check of this choice after the pictures below). I also had to figure out a way to attach the motor to the skull. I had wanted it to be in the box under the skull, nicely hidden away. However, carrying the power to the jaw's rod was too complicated for me in that case. So I drilled two holes in the sides of the skull and put two rods through them. I attached the motor to these rods and built the gears next to and on top of it. This proved to be quite a stable structure. Furthermore, I took off the tape that held the other two motors together and used Lego pieces to stabilize them instead. This took quite a bit of time and loads of help from Lyn. While I was doing that, Christine and Marie were building a paper-mache head for Wendy. Marie also found a ridiculous little nose for her on the crafts table that we later attached to her face. While the paper-mache was drying, this is how the final mechanism looked:


Sideview of the finished mechanism

Top view into the "skull"

Bottom view showing the many pieces making the structure stable
The paper-mache head drying next to the box holding the sensors and bricks
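A quick check on the 1:8 gear choice mentioned above, reading it as an 8:1 reduction from the motor to the jaw axle (my reading) and using rough, assumed numbers for the NXT motor:

% Speed/torque trade-off for the jaw gearing (rough, assumed motor numbers).
motor_speed  = 170;    % unloaded NXT motor speed, roughly, in rpm
motor_torque = 0.15;   % usable motor torque, roughly, in N*m
ratio = 8;             % 8:1 reduction chosen for the jaw

jaw_speed  = motor_speed / ratio    % jaw axle speed, rpm
jaw_torque = motor_torque * ratio   % torque available at the jaw, N*m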
       The next day, we all came together to work. I had started making a wig for Wendy, and Christine took that over while I worked on the jaw program. I wrote the code for each phrase separately and tried different angles for different syllables. I ended up with code that gently opened and closed the mouth in a generic talking motion, lasting a different amount of time for each phrase. Then Marie helped me put these in a case structure to alternate between phrases. I did not want to put them in an infinite while loop and create an unpredictable puppet. Instead, I incorporated two sound sensors into our design at the last minute. While Marie and Christine were painting the head and the face, I was getting background noise readings in the Leaky Beaker. Then Marie and I changed the program so that our puppet would say "Come closer." when the sound level was a little over the background. We also got readings for a conversation and used the average of these values as a speech threshold. When this threshold was reached, Wendy would say "Hi!" or "Hello!" If the sound level remained above the threshold for a period of time after this, she would proceed to say "Hmmm... interesting!" or, alternatively, "Tell me more!", assuming that someone had been talking to her all this time. At this point, her cosmetics were also done. We put her head, her eyebrows and her wig on, and she was ready to go! Well, except for her gigantic head. We came up with the idea of covering this up by putting a hat on her. Surprisingly, she looked pretty good this way, and a lot like Marie!
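The actual program was a LabVIEW case structure, but the threshold logic looked roughly like this MATLAB-style sketch. read_sound_level() and play_phrase() are hypothetical stand-ins for the sound-sensor read and the .rso playback blocks, and the numbers are placeholders, not our measured readings:

% Sketch of the talking logic (the real version was a LabVIEW case structure).
background = 35;                 % background reading from the Leaky Beaker (placeholder)
near_threshold = background + 5; % "someone is close" level (assumed offset)
speech_threshold = 55;           % average reading during a test conversation (placeholder)

level = read_sound_level();
if level > speech_threshold
    play_phrase('Hello');                    % or 'Hi', alternating between runs
    pause(3);                                % give the visitor time to keep talking
    if read_sound_level() > speech_threshold
        play_phrase('TellMeMore');           % or 'Hmmm... interesting!'
    end
elseif level > near_threshold
    play_phrase('ComeCloser');
end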

Wendy with her makeup on
The resemblance is undeniable!
        We were finally ready! We moved Wendy, her box, the speakers, the microphone and my laptop upstairs to the exhibit area. The microphone and speakers needed to be attached to my computer at all times to function. The speakers also needed to be plugged in. We put one speaker on each of the two front corners of our table. The microphone was taped to the NXT producing sound and stowed away under the box. The eyebrows and eyelids were set. Our sign was in front of us. We turned her on!
         Wendy did OK but was not very reliable during the exhibit. She had a couple of issues. First of all, the eye code depended on a background reading to detect motion, and this reading changed with every restart. If someone was standing in front of one of the sensors during the first second, the eyes wouldn't function properly. We had to reset many times until we got a working setup. The talking part was also flawed. People weren't able to make out what she was saying. When we explained it to them, they started talking to her, but she didn't respond quickly enough because of the wait we had put between the phrases. Some of her parts came off during the resetting process and needed to be fixed, so we kept taking off her head and fixing her constantly. Marie fixed most of these issues for the Maker Faire the next day. With the eyes tracking properly, she was truly creepy. Despite her imperfections, she provided amusement for a good while. We put her up in the studio, facing the back entry to the Science Center, so she can happily creep on many others to come.
Lyn and Marie at the end of the exhibit: Lyn is working that wig!

Friday

Final Project, IV: Making the Head

       While I was working on the jaw all this time, Marie and Christine had already produced the code for their parts. They had used three ultrasound sensors for detecting movement. The code determined which one or two were covered, and this gave the direction in which a person was moving. The eyes would follow this direction, while the eyebrows and eyelids would change to the "alert state" by opening and rising when motion was detected. This was great, but they needed to test it on the actual parts, because their code needed physical constants based on the material we used. I did some googling to find the average size of a person's eyeballs and eyelids. We ordered 1-inch polypropylene balls for the eyeballs and standard ping-pong balls for the eyelids. They arrived a few days after Marathon Monday and were added to the Lego mechanism on that Friday. Christine and Marie did this by drilling a hole in two of the plastic balls to fit a rod. They also cut one ping-pong ball in two to get the two eyelids. At this point, we had a preliminary structure to put the mechanism behind. Here is what the face looked like:
The face and eyes, being held by Christine in this picture
         To build the face and the back of the head, I first measured our heads. I looked up the relative positions of the facial features of an average person. Next, I drew an ellipse in SolidWorks that was as big as the average of our faces. I put in two holes for the eyebrows and two holes for the eyes. I cut away the part below the imaginary nose for the jaw. We thought we would wire-fit the face to the box structure around it, so no other holes were made. I also built a box structure, which we can call the skull, to rest our mechanism in. It was 15 cm high, 13.5 cm wide and 12 cm deep. We decided on these dimensions after looking at the Lego structures and how much space they took. This box structure was to be connected to the face at 4 cm. We printed these parts on 3/16-inch-thick Delrin sheet. With EJ's help, I drilled holes in these parts so that we could put the wire in. However, when the time came to cut the piano wire, Lyn and I had serious difficulties. There was no way to cut so many pieces of wire, and locate all the pieces flying across the room, in a reasonable amount of time. I hadn't wanted to use press fitting because I had found it hard to calculate and not very reliable during our bird project. However, I had no other option, so I got to work. Briana had been doing a lot of press fitting for the Creep, and she kindly helped me understand the principle behind a tight fit. I used her notes to calculate the holes and drew a new head structure for Wendy (there is a rough sizing sketch after the part images below). I first printed two rectangles, one with a hole and one with a piece sticking out, to try my tight fit. It was very tight and I was happy with it. I printed the rest of the parts using the following files:



Two copies of this part made the sides of the skull

This is the back of the skull.


The face
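The sizing idea from Briana's notes, roughly: draw each slot a little narrower than the mating tab (which is just the Delrin sheet thickness), so the pieces have to be forced together. A toy MATLAB version with made-up numbers, assuming the parts were laser cut:

% Rough press-fit slot sizing (illustrative numbers only, not Briana's).
tab_width    = 3/16 * 25.4;  % mating tab is the Delrin sheet thickness, in mm
kerf         = 0.2;          % width the cutter removes, mm (assumed)
interference = 0.05;         % extra squeeze wanted for a tight fit, mm (assumed)

% The cut widens a drawn slot by roughly one kerf, so draw it smaller:
drawn_slot_width = tab_width - kerf - interference   % mm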
        The fit was so tight that I needed to use a hammer to put all the pieces together. This was great because it also meant that it would be hard for them to come off. Marie and Christine taped their Lego motors together, put the mechanism inside, and started debugging their code. I started getting the jaw ready to put into the skull. Due to an error in my calculations, it was about 5 mm too wide to fit into the skull. While they were trying to make the eyes follow reliably, I was filing away at my precious jaw...

Final Project, III: Figuring The Jaw Out

       The first thing I did in the next class was build a new jaw out of Lego. This was a little too puzzling for me for some reason, so I decided to examine A.L.F.I. for some clues. I also read the blog on its development. Then I consulted Lyn and finally built a hinged jaw. The result was quite simple, and I was a little embarrassed that it had taken me so long to wrap my head around it. For the time being, I moved on to the programming part.
       The biggest challenge I had also became apparent after a chat with Lyn. I had assumed that we could record ourselves talking, program the NXT using these files, and the sound would come from the brick. Well, it wasn't that easy. We didn't even know if the NXT could play such sophisticated sound; we had only heard it beep and play the short start-up melody so far. Lyn suggested that I program the NXT to send a message via Bluetooth to a device that could play music. He said he could supply me with an Android phone or something of the sort for playing the sound files. Therefore, I started researching how to get an NXT to play a sound file and, alternatively, how to communicate via Bluetooth. By the end of the class, all I had visibly accomplished was making the NXT play a barely audible 'Is this working? What the hell!?' in my voice when its button was pushed. That was far, far away from my goal, but I learned a lot during this class. I learned that the sound file type for LabVIEW is .rso and got the software for converting other types of sound files to it. I also got software for recording and editing sound. Additionally, I tried using text-to-speech software to create a robotic and creepier voice, but this did not work. I also started deciding on phrases. The end of the class did not look hopeful, but I knew we would figure it out. I knew we had to, because Deepika and Lisa's project depended on being able to play music. Chris was out of the country, and Lyn said he would look into it. I remember leaving the class thinking this would be harder than I had expected.
        I decided to focus on building the jaw until I had more information on how to make the NXT talk. Around the same time, I sent my teammates a list of possible phrases for our puppet and asked for their opinions. Together, we decided on the following sentences:
Hello!
Hmmm... interesting!
Tell me more...
Come closer...
And how do you feel about that?
        The last one got eliminated later because it was too long and we didn't have enough memory on the NXT brick to store all these files. We replaced it with a 'Hi!' as an alternative to 'Hello!'. The jaw wasn't as easy, unfortunately. Another helpful chat with Lyn guided me in building the jaw in SolidWorks. I was really excited because I found out that we could use the 3-D printer for our jaw. After a few hours, I had a part I could print. Or so I thought. Now I had to figure out how to put an axle, specifically a Lego rod, through it. I reasoned that making an extruded cut in the jaw in the shape of the rod's cross section would give the expected result. I talked to Briana at Lyn's suggestion, because her team had made circular holes for these rods. She gave me the diameter of the hole, and I used that to build a cross in a rectangle. Lyn had advised me to test the hole before printing the jaw, since printing is a long and expensive process. So I printed a small cube with a hole in it using the 3-D printer. This took us about half an hour, but unfortunately it didn't fit. So I tried again, making the hole larger by 0.1 mm on each side. This did not work either. My next trial involved three pieces, each larger by 0.05 mm than the previous one. Trial 3 also failed. I got better results on the fourth trial, again with three pieces: the smaller two could be pushed onto the axle with lots of force, and the largest one fit very tightly and well with relative ease. This largest hole, with 1.85 mm sides, made me very happy:

Success!
      After a little resistance from SolidWorks, I was ready to print the jaw. We had about a week left and needed to speed things up. When we oriented the jaw for most efficient printing, however, the orientation of the hole changed. We didn't know if the dimensions would still be right this way so we printed the right-sized cube in this orientation to find out. The result was disastrous, mostly because of the debris that had piled up around the tip and in the bottom of the printer. Lyn cleaned it up and we decided to print it in the less efficient orientation that was sure to give us a correctly sized hole. This took a whole night. I learned how to use the base bath and put the jaw in there so the support material would melt. It looked like this before the bath:
Fresh out of the printer!
         While it was getting ready in the bath, I had other things to worry about, like giving it a head...

Final Project, II: Decisions

       The next week, we had a number of decisions to make. We needed to figure out how we were going to achieve the motions we wanted, what material we were going to use, and who was going to do what. A fair amount of emailing and some in-class discussion led to results. We decided that we would use ultrasound sensors for detecting people and triggering eyeball, eyebrow, eyelid and head rotation. We had initially considered cameras but found out that ultrasound sensors are much easier to work with. We also decided that we would build a face out of Delrin sheet and cover it with a mask of some sort. Ping-pong balls were the material of choice for both eyeballs and eyelids. The inner structure was to be built out of Lego. Then we discussed who would do what. Since we had three people who each showed clear interest in a specific part of the puppet, we thought dividing the project by parts would be the most efficient. I was really excited about the creepy speech, so I took on the jaw and the speech programming. Marie was into making crazy expressions and winks, so she took the eyelids and eyebrows. We had just watched the famous scene from The Exorcist, and Christine took the eyeballs and the head. We wrote out a plan of action for the weeks leading up to the final exhibit.
       The next step was building Lego models of our puppet. We each built an individual structure, and at the end we were able to put them together for our presentation. I built a jaw that opened with an up-and-down mechanism using a PicoCricket. I was trying to minimize the number of motors and NXT bricks we used, because they are large and take up too much space to fit in a head. I also explored other options that were available to us, like servo motors. However, my model showed me that we needed a hinge mechanism, rather than an up-and-down one, for a more realistic puppet. This meant that I needed to be able to control the angle the jaw opened to, in order to produce speech-like angle variation. This eliminated the other options and made it clear that I needed to use an NXT motor. Christine built eyeballs that slid in their sockets, and Marie had eyelids and eyebrows attached to them. Our model looked quite impressive when we presented it to the class. It also provided us with valuable information on how we should proceed. After this point, I focused mostly on my individual duties while Marie and Christine worked more together. Their parts moved in concert and depended on the same stimuli, so they wrote their code together. I only know the basic principle behind their program and have seen it only a few times. From now on, I will talk mostly about my own contribution to the project.
      

Final Project, I: Ideas and Presentation

      It was finally time for our final projects. We had already been told that the theme would be "puppets". Christine and I were already partners, and we stayed that way. Marie also joined us as we came together to become the only and most awesome trio of the class. We had a number of ideas and had a lot of fun discussing different kinds of puppets. We went online and looked at many examples on the first day. We saw a lot of animal puppets, some marionettes and a few sock puppets. I liked the idea of a dancing puppet and suggested we build one. I thought we could incorporate many popular songs and dance routines into our project, such as the Macarena, the Ketchup Song and Tunak Tunak Tun. We went through many versions of this dancing puppet before we decided that an elephant would be the greatest. We wanted this elephant to be able to turn on its bottom, have huge ears that moved when there was a sound, and a waving trunk. We also wanted it to be able to do the dances associated with the previously mentioned songs. This was the first of our ideas that we really developed.
                Another idea we had was a “Wendy Wellesley” puppet. We wanted a puppet head that looked like a regular girl from Wellesley College. She would be a good friend who listened to you and said reassuring things. We also wanted to make sure she looked like she was paying attention. Therefore, we would program her eyeballs so that they followed people around her. Her eyelids would close and maybe even wink. Her eyebrows would also move to produce expressions. She might even yawn.
                A couple of cool ideas came from Christine. She suggested sock puppets. We’d have a boy and a girl sock puppet that would pretend to talk to each other when they detected conversation. They would recognize pitch: the boy would assume the identity of the person with the lower-pitched voice, while the girl would talk when the higher-pitched person talked. We also talked about having a move that they would do when someone talked for too long. Another idea she had was a shadow puppet version of Wendy Wellesley, in case the main design was too hard to carry out.
                We presented these ideas to the class and got feedback. It was clear that the theme of the year was creepy robots. Wendy's following eyes fit quite well into this theme, and she received positive responses. The elephant was also popular. As a result, we chose Wendy and the elephant as the projects we wanted to focus on. We realized that another group was also considering building a dancing robot, so we decided to develop Wendy further. We added some other features, such as full head rotation, being mounted on the wall, and creepy lines, to strengthen the creepiness of our puppet. We then presented our more-developed ideas to Chris at Tufts. This visit confirmed Wendy as our final project.

A Bit Of Programming, II

       During the next class, we continued with the theme of programming, using MATLAB for the first time. We were given a link to an online book on MATLAB and several exercises to program. Although in retrospect the first four were relatively easy, it took us a long time to figure out what to do. I knew some physics and Christine knew some Java, and with loads of help from Lyn and Chris, here are the results:

Prompt:
(The expression for calculating Fibonacci numbers is given.) Translate this expression into MATLAB and store your code in a file named fibonacci1. At the prompt, set the value of n to 10 and then run your script. The last line of your script should assign the value of Fn to ans. (The correct value of F10 is 55).

Program:
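A minimal sketch of a fibonacci1.m along these lines, assuming the closed-form (Binet) expression from the book (our actual script is shown only as a screenshot):

% fibonacci1.m -- computes Fn from the closed-form expression.
% Set n at the prompt first, e.g. n = 10.
phi = (1 + sqrt(5)) / 2;
ans = (phi^n - (1 - phi)^n) / sqrt(5)   % with n = 10 this gives F10 = 55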
Prompt:
Imagine that you are the owner of a car rental company with two locations, Albany and Boston. Some of your customers do “one-way rentals,” picking up a car in Albany and returning it in Boston, or the other way around. Over time, you have observed that each week 5% of the cars in Albany are dropped off in Boston, and 3% of the cars in Boston get dropped off in Albany. At the beginning of the year, there are 150 cars at each location. Write a script called car_update that updates the number of cars in each location from one week to the next. The precondition is that the variables a and b contain the number of cars in each location at the beginning of the week. The postcondition is that a and b have been modified to reflect the number of cars that moved. To test your program, initialize a and b at the prompt and then execute the script. The script should display the updated values of a and b, but not any intermediate variables.

Program:
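A minimal sketch of such a car_update.m (ours may have differed in details):

% car_update.m -- moves 5% of the Albany cars to Boston and 3% of the
% Boston cars to Albany; a and b must already exist in the workspace.
atob = 0.05 * a;       % cars going Albany -> Boston this week
btoa = 0.03 * b;       % cars going Boston -> Albany this week
a = a - atob + btoa    % no semicolon, so the new value is displayed
b = b + atob - btoa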
Prompt:
Create a script named car_loop that uses a for loop to run car_update 52 times. Remember that before you run car_update, you have to assign values to a and b. For this exercise, start with the values a = 150 and b = 150.

Program:
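And a matching sketch of car_loop.m:

% car_loop.m -- runs car_update once per week for a year.
a = 150;
b = 150;
for week = 1:52
    car_update         % a script call; it updates a and b in place
end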
Prompt:
Experiment with the following simulation of the trajectory of a hit baseball. For what angle is the horizontal  distance maximized?
% Model of trajectory of hit baseball. Assumes no air drag. 
% What angle maximizes the distance travelled? 

clf % clear the previous figure
hold on % allow plotting of multiple figures
angle = 30; % angle baseball hit, in degrees
velocity = 50 % initial velocity of baseball, in meters/sec
rads = angle * pi / 180;
velocity_x(1) = velocity * cos(rads); % horizontal velocity of baseball
velocity_y(1) = velocity * sin(rads); % vertical velocity of baseball
x(1) = 0; % x position of batter
y(1) = 1; % assume baseball hit 1 meter off ground
dt = 0.1; % time step for simulation, in seconds
g = 9.8; % gravitational constant, in meters/sec^2
i = 1 % iteration index
while y(i) > 0 
    x(i+1) = x(i) + velocity_x(i)*dt; % Update x position 
    y(i+1) = y(i) + velocity_y(i)*dt; % Update y position
    velocity_y(i+1) = velocity_y(i) - g*dt; % y velocity changes due to gravity
    velocity_x(i+1) = velocity_x(i); % x velocity doesn't change (assume no air drag)
    plot(x(i), y(i), 'r.-');  % display a red dot for each point, and connect them with lines
    i = i + 1; % change index for next iteration
end
x(i) % Display the final x value.

Program:
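One way to answer the question is to wrap the given simulation in a loop over angles and record the final x for each; this is just a sketch, and our in-class program may have looked different:

% Sweep the launch angle and record the horizontal distance for each.
clf; hold on;
angles = 10:5:80;                  % angles to try, in degrees
distances = zeros(size(angles));
velocity = 50; dt = 0.1; g = 9.8;  % same constants as the model above
for k = 1:length(angles)
    rads = angles(k) * pi / 180;
    vx = velocity * cos(rads);     % horizontal velocity
    vy = velocity * sin(rads);     % vertical velocity
    x = 0; y = 1;                  % ball hit 1 meter off the ground
    while y > 0
        x = x + vx * dt;
        y = y + vy * dt;
        vy = vy - g * dt;
    end
    distances(k) = x;              % where the ball landed
end
plot(angles, distances, 'b.-');
xlabel('angle (degrees)'); ylabel('horizontal distance (m)');
[maxd, best] = max(distances);
angles(best)                       % angle giving the longest hit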


Prompt:
Below is a simple model of a DC motor.  Implement this model in MATLAB by making a simulation motor.m similar to the baseball trajectory simulation above.  Note that the Greek omega is the angular velocity (in radians/second) and the Greek tau is torque (in Newton-meters).  In one time step dt, the rotation changes by omega*dt radians, where there are 2*pi radians in 360 degrees. Also in one time step, omega changes by (tau/m)*dt, where m is the moment of inertia of the motor.

Assume that you run the model until the rotation is 90 degrees.

You may assume the following values:

V_terminal: 5 Volts
K_motor: 0.3 Newton-meter/amp
R: 25 ohms
m (moment of inertia for motor): 0.0001 Kilogram-meter^2
dt = 0.001 (simulation time step)
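Since the motor diagram itself isn't reproduced here, this sketch of a motor.m assumes the standard DC-motor relations i = (V_terminal - K_motor*omega)/R and tau = K_motor*i:

% motor.m -- simulate the motor until the rotation reaches 90 degrees.
V_terminal = 5;        % Volts
K_motor = 0.3;         % Newton-meter/amp
R = 25;                % ohms
m = 0.0001;            % moment of inertia, kg*m^2
dt = 0.001;            % time step, seconds

omega = 0;             % angular velocity, rad/s
theta = 0;             % rotation, radians
t = 0;
while theta < pi/2     % 90 degrees = pi/2 radians
    i = (V_terminal - K_motor * omega) / R;   % motor current
    tau = K_motor * i;                        % torque
    omega = omega + (tau / m) * dt;           % omega changes by (tau/m)*dt
    theta = theta + omega * dt;               % rotation changes by omega*dt
    t = t + dt;
end
t                       % time to reach 90 degrees, in seconds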





and then,
In a file motor_control_prop.m, develop a proportional control motor controller for the motor model from #6. Introduce a variable K_prop that controls the proportional gain. Design your program so that it runs for several simulated seconds and displays the last time t_last_bad the angle was more than 0.1 degrees away from 90 degrees. Fiddle with K_prop to minimize t_last_bad.  What are your values of K_prop and t_last_bad for the least t_last_bad you observe?
Program (proportional control):
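A sketch of how motor_control_prop.m can be structured; the gain and run length are placeholders to fiddle with, and clipping the command to plus or minus 5 V is my assumption:

% motor_control_prop.m -- proportional control of the motor model above.
K_motor = 0.3; R = 25; m = 0.0001; dt = 0.001;
K_prop = 2;                 % proportional gain, Volts per degree (tune this)
V_max = 5;                  % clip the command to the 5 V supply (assumed)
target = 90;                % target angle, degrees

omega = 0; theta = 0;       % rad/s and radians
t_last_bad = 0;
for step = 1:5000           % about five simulated seconds
    t = step * dt;
    angle_deg = theta * 180 / pi;
    err = target - angle_deg;
    V = max(-V_max, min(V_max, K_prop * err));   % proportional command, clipped
    i = (V - K_motor * omega) / R;
    tau = K_motor * i;
    omega = omega + (tau / m) * dt;
    theta = theta + omega * dt;
    if abs(angle_deg - target) > 0.1
        t_last_bad = t;     % last time the angle was more than 0.1 degrees off
    end
end
t_last_bad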

A Bit Of Programming, I

      The next few days were spent exploring ways of programming the NXTs and MATLAB. There were a number of projects we did during this time. The first was getting our motor to move to exactly 90 degrees. I think we might have done this before we made the ramp climber. In any case, here is what we did: we wrote a program in LabVIEW that said go forward, wait for 90 degrees, and brake. We also had a program that let us read the rotation from the NXT's screen. We observed that, due to the time it takes for the NXT to realize it has gone 90 degrees and put the stop command into action, it always moved more than 90 degrees. Therefore, we changed our program so it would move back when it overshot.

Our code for moving exactly 90 degrees
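The real program was the LabVIEW diagram above; the same idea in MATLAB-style pseudocode, where read_degrees and set_motor_power are hypothetical helpers and not a real NXT API, is roughly:

% Drive forward, wait for 90 degrees, brake, then back up past the overshoot.
target = 90;
set_motor_power(50);              % go forward
while read_degrees() < target     % wait until the encoder reports 90 degrees
end                               % (the stop always comes a little late)
set_motor_power(0);               % brake -- by now we have overshot
while read_degrees() > target
    set_motor_power(-10);         % creep backwards
end
set_motor_power(0);               % stop at (roughly) 90 degrees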

      Next, we were to build a program that modelled the NXT motor controller. We used the NXT's data logging feature to collect data. Then, we put this data into Excel and got a line that predicted the motor's motion.

The data from the NXT
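We did the fit in Excel, but the same straight-line fit in MATLAB would look something like this (the numbers below are placeholders, not our logged data):

% Fit a line to the logged rotation-vs-time data (placeholder values).
t        = [0 0.1 0.2 0.3 0.4];        % time stamps from the NXT log, seconds
rotation = [0 18 37 55 74];            % logged motor rotation, degrees
p = polyfit(t, rotation, 1);           % p(1) = slope (deg/s), p(2) = intercept
plot(t, rotation, 'o', t, polyval(p, t), '-');
xlabel('time (s)'); ylabel('rotation (degrees)');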


Here is our program for the NXT motor controller:

Here is the final graph we arrived at: