Thursday, November 28, 2013

Student Machinima in OpenSim With Greenscreen

I have a student, an 11th grader, working on one of the most complex tech projects I've seen. Her English teacher gave an open-ended project assignment for students to make something that depicts a scene from Bram Stoker's Dracula. She decided to make a machinima that would require up to four avatars, all controlled and filmed by herself, in one week.
I think Rand Spiro's Cognitive Flexibility Theory describes well the use of technology to solve creative problems. Of course there are optimal setups for creating machinima, but here I had to help a student realize a very complex project with minimal training and portable equipment, as she had to complete it over the Thanksgiving weekend.
So here's what I came up with. I'll find out on Monday how well it worked:

  1. We have a school OpenSim virtual world, but port forwarding is not currently set up in the firewall, so it's not an option for working at home. She has a Mac, so she couldn't use SoaS. So we set up her own sim, installing MySQL, Mono, and the Diva distro. After a few hiccups she was up and running.
  2. We needed the Diva account functionality to set up the character and camera-person accounts. I had chosen Imprudence as the viewer, but for some reason the Diva splash page wasn't showing up, so I found Singularity, which turns out to be quite awesome.
  3. Next was loaning her a PC she could use for the filming. I put Fraps and Singularity on it and taught her how to change her MyWorld.ini and RegionConfig.ini files to reflect her LAN IP so the PC could log into her sim. I also taught her how to use Fraps, which is dead simple.
  4. For multiple avatars in the same scene, all directed by her, she would need a green floor and background, which she would then composite together in a video editor. She made some nice avatar costumes and developed gestures from the stock OpenSim animations.
  5. She would have preferred to edit the footage in iMovie on her Mac, as would I, but having the files on the PC in AVI format complicated things: they would have to be converted to MOV for iMovie. It was too much for me to explain and add to the workflow, so I opted to have her use Movie Maker on the PC. That decision could prove to be the project's undoing, as greenscreen is very hard to work with in WMM. We'll see. You have to use WMM 6.0 and install RehanFX shaders, and syncing the overlaid clips is almost impossible; it seems pretty much set up just to make cheesy music videos with ridiculous backgrounds. But it would have to work.
  6. UPDATE: I am happy to say she managed to get the AVI files copied to her Mac and used either Perian or Evom to convert them. iMovie makes clip editing much easier, even after applying the greenscreen.
  7. So the greenscreen workflow consists of syncing two avatar clips and applying the greenscreen filter so both avatars appear over green. Export that and reimport it. Add a third avatar clip, sync it, apply the greenscreen filter again, and export. Repeat for the fourth avatar clip. Finally, reimport that and greenscreen it over the chosen background image, and if possible figure out how to work in multiple background images for scene changes, which I'm not even sure is possible.
I hope all this works; we'll see.
UPDATE: She completed the video and it came out amazingly well! Here it is. She did end up figuring out how to convert the files from AVI to MOV and move them to iMovie. That should teach me not to assume something will be too hard for someone.

Monday, November 18, 2013

Programming TETRIX Servos With leJOS NXJ

The last time I taught my high school robotics class I used RobotC to program TETRIX servos. The RobotC API provides the functions servoValue, servo, and servoChangeRate. From the documentation we learned that the only way to be sure not to push your servo against a physical barrier and damage it is to avoid setting it to a position it can't reach. The easy programming also allowed us to avoid learning about how servos really work. leJOS NXJ has tools for dealing with servos that afford a much better learning experience, in my opinion. The leJOS API provides setRange(), setAngle(), setPulseWidth(), getAngle(), and getPulseWidth(). At the very least you will need to call setRange and setAngle, because setAngle depends on a range of movement having been set with setRange. With leJOS it behooves you to set a servo to a safe range of movement before moving it around, and to do so you have to understand something about how pulse width modulation makes servos run. Two articles do an excellent job explaining how servos work, one from Jameco Electronics and one from Science Buddies. But there is still an information gap when it comes to using the setRange and setAngle methods. The documentation provides the following:

public void setRange(int microsecLOW, int microsecHIGH, int travelRange)
"Set the allowable pulse width operating range of this servo in microseconds and the total travel range. Default for pulse width at instantiation is 750 & 2250 microseconds. Default for travel is 200 degrees."
The parameters are defined as follows:
microsecLOW - The low end of the servos response/operating range in microseconds
microsecHIGH - The high end of the servos response/operating range in microseconds
travelRange - The total mechanical travel range of the servo in degrees
To better understand what these values mean I created some diagrams that make clear the function of each parameter.
The minimum and maximum pulse widths allowed are 750 and 2250 microseconds, but if you use these extremes you are in danger of the servo arm hitting the robot.
If the servo horn is attached such that the servo's physical stops are tilted, the arm can hit the robot even with a safe min and max PWM range.
The third argument to setRange sets the number of programmable positions between the min and max limits.
Setting travelRange to 10, for example, will greatly reduce the positioning precision the servo is capable of.
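To make the relationship between the three parameters concrete, here is a small sketch of the arithmetic, assuming leJOS maps the angle linearly across the pulse-width range set by setRange. The helper class below is hypothetical, written just for illustration; it is not leJOS source code, and on the NXT you would call setRange and setAngle on the servo object itself.

```java
// Hypothetical illustration of how setRange and setAngle relate,
// assuming a linear mapping of angle onto the pulse-width range.
// This is NOT leJOS source code.
public class ServoMath {

    // Pulse width (microseconds) for a given angle, given the
    // setRange parameters (microsecLOW, microsecHIGH, travelRange).
    static double pulseWidthForAngle(int low, int high, int travel, double angle) {
        return low + (angle / travel) * (high - low);
    }

    // Pulse-width resolution: microseconds of change per one-degree
    // step of setAngle. A smaller travelRange means a coarser step.
    static double microsecondsPerDegree(int low, int high, int travel) {
        return (high - low) / (double) travel;
    }

    public static void main(String[] args) {
        // With the defaults (750, 2250, 200), the midpoint angle of
        // 100 degrees maps to the center pulse width of 1500 microseconds.
        System.out.println(pulseWidthForAngle(750, 2250, 200, 100)); // 1500.0

        // Default travel of 200 degrees: 7.5 microseconds per degree.
        System.out.println(microsecondsPerDegree(750, 2250, 200)); // 7.5

        // travelRange of 10: each degree now jumps 150 microseconds,
        // which is why a small travelRange wrecks precision.
        System.out.println(microsecondsPerDegree(750, 2250, 10)); // 150.0
    }
}
```

This also shows why narrowing the low/high limits is the safe way to protect the servo: the angle arguments stay the same, but they map onto a smaller, physically reachable slice of the pulse-width range.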

Saturday, November 02, 2013