
CS371 Spring 2007
Prof. McGuire
Assigned: Mon. 3/5/07
Due: Mon. 3/12/07 8:00 pm

Project 0x04: FX [1]

[Before / After frame images]

This week you will create digital effects similar to those used to produce the Star Wars and Matrix movies. The images above show the raw data captured on a film set and the final composited image produced from the program that you will implement. Note the new background, reflections of the actors, glowing light sabers, and correct gamma adjustment.

This is not a cumulative assignment, so you will not be able to reuse much code from previous weeks and will not reuse this code in future weeks unless you choose to do a final project based on film effects. This gives you some liberty to make your code relatively specific to the input data; however, you should still exercise good software design judgment in structuring and documenting your code.

The documentation requirements for this week's project are more detailed than for previous weeks because you need to demonstrate the intermediate results of your computations. Read them carefully before you begin!

You may work with one other person on this project. Many steps of the video editing process can be made easier if you are skilled in the use of software like Photoshop and Final Cut Pro. Consider forming your team to balance art and programming skills.

[1] Special thanks to Kathleen Creel, Daniel Fast, Scott Olesen, and Kate Foster for helping to create the video input sequences for this project.

FX Specification

Implement a digital effects pipeline for movie frames that performs matting, compositing, and bloom on specific props.

1. Devise and implement parsing of a human-readable (text) configuration file format that specifies:
   a. the name of a series of input images,
   b. the name of a single or series of background images,
   c. the name of a series of output images, and
   d. any constants that will be needed by your program.
2. Read the name of the configuration file from the command-line arguments to your program; i.e., you should not need to recompile your program to process a different input sequence.
3. Load monochrome .PNG files with sequential filenames.
4. Perform nearest-neighbor Bayer demosaicing.
   a. You MAY NOT use the G3D Bayer functions or look at their source code.
   b. You MAY NOT use Bayer demosaicing code found on the internet.
   c. You may use any other part of G3D.
5. Crop the input image to the specified bounds or otherwise remove garbage from the original film set.
6. Use green-screen matting to separate the actor from the background in the input sequence.
7. For the light-saber sequences, make the light sabers appear to glow by applying a bloom-like effect only to them.
8. Composite the results against a new background (or series of background images).
9. Gamma correct the final image.
10. Write the results out to a series of images.
11. Create an MPG, MOV, or AVI file that can be played on the Macintosh in the Unix lab. The movie will be worth substantially more points than the other steps when graded. You do not have to automate this step.

If you choose to add additional non-programmatic effects during the editing of your movie, you must submit a separate movie that contains only effects produced by your program.

You will find many Photoshop tutorials on the web that describe the light saber effect and green (or blue) screen matting. These can be very helpful in designing your algorithm. However, excepting extra credit, your final implementation must be purely algorithmic.
For example, you cannot manually select the light saber in each frame as part of your effects pipeline.

You may choose to use any language, platform, and libraries you wish, although only C and G3D on the department's FreeBSD systems are supported. If you use C as I recommend, please use Java (yes, Java) coding conventions for your C code.

Implementation Tips

You can find several video sequences at /usr/local/371/data/video. Sample Bayer before and after frames are in /usr/local/371/data/video/bayer.

Unless you are implementing some form of temporal coherence (for extra credit), there is no need to have more than one frame of animation loaded at once. Set up a loop to load each image, process it, and then move on, in order to save memory.

The original Blue Screen Matting paper, which described the technique that had already been in use for about 20 years, is in n-s96.pdf.

You don't have to implement Vlahos' method exactly. Use whatever it takes to algorithmically distinguish between the green backdrop and the actors in the foreground. I've found that the following tricks are often necessary:

- Since we have a green screen, swap green and blue in Vlahos' equation. Then replace the "blue" with max(red, blue).
- Always set bright (> 0.95) and dark (< 0.01) pixels to alpha 1.0 regardless of what the equation computes.
- After computing alpha, increase its contrast with alpha = (alpha - k3) * k4.
- Tune the constants interactively: in onUserInput, make different keys change different constants up and down, so that you can watch their changes in real time instead of recompiling your whole program.
- Multiply the final alpha channel by a hand-drawn "garbage matte" that is black in areas that you always want to cut out and white in areas that may contain foreground objects. Because the camera is stationary relative to the set, you can paint a single garbage matte for the whole video sequence.

The basic idea behind making a light saber glow is to first create a matte for the saber (which is the opposite of the process of creating a matte for the actor against a green screen!), blur out the saber, and then add the blurred image back into the original. For a more authentic appearance, combine a colored blur with a white core.
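The matting tricks above can be sketched per pixel as follows. This is a minimal sketch, not the prescribed implementation: the function name matte_alpha and the constants k1 and k2 (the two Vlahos scale factors) are assumptions, as is reading "bright" and "dark" as the maximum channel value; k3 and k4 are the contrast constants from the tips above.

```c
#include <assert.h>

static float fmax2(float a, float b) { return a > b ? a : b; }

static float clampf(float x, float lo, float hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

/* Vlahos-style alpha with green and blue swapped for a green screen
 * and "blue" replaced by max(red, blue).  Inputs are one pixel's
 * RGB in [0,1]; returns foreground alpha in [0,1].  The constants
 * k1..k4 should be tuned interactively, as suggested above. */
float matte_alpha(float r, float g, float b,
                  float k1, float k2, float k3, float k4) {
    float brightness = fmax2(r, fmax2(g, b));

    /* Very bright or very dark pixels are always foreground. */
    if (brightness > 0.95f || brightness < 0.01f) return 1.0f;

    /* Swapped Vlahos equation: alpha = 1 - k1 * (G - k2 * max(R, B)). */
    float alpha = 1.0f - k1 * (g - k2 * fmax2(r, b));

    /* Increase contrast: alpha = (alpha - k3) * k4. */
    alpha = (alpha - k3) * k4;

    return clampf(alpha, 0.0f, 1.0f);
}
```

With neutral constants (k1 = k2 = k4 = 1, k3 = 0), a mid-brightness green pixel produces a low alpha (background) while a non-green pixel saturates to alpha 1 (foreground). Multiply the result by the garbage matte as the final step.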

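The glow recipe above (matte the saber, blur it, add it back) can be sketched as below. This operates on a single luminance channel for brevity; a real pipeline would run it on RGB and tint the blurred layer. The function name bloom, the brightness-threshold matte, and the tiny box blur are all assumptions, not the assignment's required method.

```c
#include <stdlib.h>

/* 1. Matte: keep only pixels brighter than a threshold.
 * 2. Blur the matte (a 3x3 box blur here; repeat or widen it for a
 *    softer glow).
 * 3. Add the blurred matte back into the original and clamp, so the
 *    saber keeps its bright core with a halo around it.
 * img is a w*h array of [0,1] values, modified in place. */
void bloom(float *img, int w, int h, float threshold) {
    float *glow = (float *)malloc((size_t)w * h * sizeof *glow);
    float *blur = (float *)malloc((size_t)w * h * sizeof *blur);

    /* Saber matte: bright pixels pass through; everything else is 0. */
    for (int i = 0; i < w * h; ++i)
        glow[i] = img[i] > threshold ? img[i] : 0.0f;

    /* 3x3 box blur of the matte. */
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f; int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int xx = x + dx, yy = y + dy;
                    if (xx >= 0 && xx < w && yy >= 0 && yy < h) {
                        sum += glow[yy * w + xx];
                        ++n;
                    }
                }
            blur[y * w + x] = sum / n;
        }

    /* Add the glow back over the original. */
    for (int i = 0; i < w * h; ++i) {
        img[i] += blur[i];
        if (img[i] > 1.0f) img[i] = 1.0f;
    }

    free(glow);
    free(blur);
}
```

For the saber this must be applied only to the saber matte, not the whole frame, so combine the threshold with a color test or a coarse region mask from your configuration file.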
Even if you have trouble with one of the effects, that should not prevent you from implementing later stages of the pipeline. Use Photoshop or the Gimp to manually implement that effect (or produce appropriate input for a later stage from scratch) so that you can get points for most of the requirements.

I gamma-correct after applying all other effects, but you can probably make it work in any order after Bayer; the light saber is a hack anyway.

Sometimes the data just isn't there in the input. If you can't find a good set of matting constants for a data set, maybe that's a bad video sequence. Try another one, or hand-draw a good sample image to start with when you're debugging your algorithms. Tweaking the constants for matting is a time-consuming and frustrating step, because there are so many of them and it is hard to get the foreground separated well.

There are several ways of turning your frames into a movie. iMovie and Final Cut Pro are both available to you in the Unix and Very Special Purpose labs, and the OIT staff can help you with other methods.

Innovate!

Some of the video sequences include the use of a blaster rifle, which shoots at the Jedi (he then deflects the blasts back at his attacker using his light saber). You can add blaster shots by manually drawing them for each frame. That's time consuming and may produce poor results, however. A better method is to write code that lights up pixels (like your light saber effect) along the path of the blaster shot programmatically. You'll have to specify the shot start and end points in your text file, of course.

Nearest-neighbor demosaicing produces relatively poor results. This means that your matting algorithm will likely have bad results around the edges of objects and that the images will look either slightly grainy or blurry. There are several better methods for demosaicing. Bilinear demosaicing interpolates the missing channels from the eight neighbors at a pixel rather than just using one of them.
Malvar et al.'s method performs more advanced filtering across a larger set of pixels.

It is common to composite multiple layers of effects. For example, a film might contain a 3D-rendered background, live-action actors, and then weather effects like rain and smoke composited over the actors.

Manual retouching is a fact of life in film production. Green-screen mattes are regularly retouched in Photoshop to fix locations where the algorithm was in error. In many cases, the wires holding up flying actors and space ships are manually edited out as well. You can retouch your frames and intermediate results for the movie (e.g., manually fixing matting errors) as long as you also produce a video that is unmodified.
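The required nearest-neighbor demosaicing discussed earlier can be sketched as follows. This is a minimal sketch written independently of G3D's Bayer code; it assumes an RGGB 2x2 mosaic (the real sequences' layout may differ), even image dimensions, and a hypothetical function name demosaic_nn.

```c
/* Nearest-neighbor demosaic for an assumed RGGB mosaic:
 *     R G
 *     G B
 * bayer is a w*h monochrome image; rgb is w*h*3 interleaved output.
 * Each pixel in a 2x2 cell copies its missing channels from the
 * nearest sample of that color within the same cell; each row keeps
 * the green sample on its own row. */
void demosaic_nn(const float *bayer, float *rgb, int w, int h) {
    for (int y = 0; y < h; y += 2)
        for (int x = 0; x < w; x += 2) {
            float r  = bayer[y * w + x];
            float g0 = bayer[y * w + x + 1];       /* green on red row  */
            float g1 = bayer[(y + 1) * w + x];     /* green on blue row */
            float b  = bayer[(y + 1) * w + x + 1];
            for (int dy = 0; dy < 2; ++dy)
                for (int dx = 0; dx < 2; ++dx) {
                    float *p = rgb + ((y + dy) * w + (x + dx)) * 3;
                    p[0] = r;
                    p[1] = (dy == 0) ? g0 : g1;
                    p[2] = b;
                }
        }
}
```

The bilinear upgrade mentioned above would instead average, for each missing channel, the available like-colored neighbors around the pixel rather than copying a single one, which softens the blocky 2x2 artifacts this version produces.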

Behind-the-scenes videos are a great way to demonstrate your work and will receive extra credit. Instead of just a final result video, submit a video that shows the following sequences in order:

1. Final result
2. Input
3. after Bayer
4. after Gamma
5. after Matting
6. after Compositing
7. after Bloom
8. after other Effects (final result again)

There is great potential for artistic extra credit on this assignment. Here are just a few examples:

- You can implement filters of your own design; e.g., a median filter and black outlines around silhouettes will make the foreground look like a cartoon, and you can then composite it over a cartoon background.
- Create a number of different sequences and edit them together to tell a story. Add dialogue, sound effects, and music.
- Composite multiple "foreground" characters into a single scene. Characters don't have to appear on screen in their original position; you can slide them and the background to make them walk or to match virtual camera movement.
- Flipping an image upside down and adding a faint version of it back into the original simulates a reflective floor.
- Implement artificial depth-of-field effects to defocus the foreground or background. Changing focus between foreground and background (a rack-focus effect) is a good way of shifting the viewer's attention.

Although you can only work with at most one other student on the assignment proper, you can carefully plan with another team to each produce different effects for different scenes and then combine your scenes into one extra-credit movie.

Submitting Your Solution

1. Put your name, e-mail address, and the name of the file in a doxygen comment at the top of each file.

2. Create a "doc-files" directory that contains a readme.html file with your name, e-mail address, partner's name (if you worked in a pair), and anything you'd like to point out to me when I'm reviewing your program.
   a. If there are known bugs, extra credit features, or design points of note, list them here.
   b. Credit any code that you found on the internet.
   c. If you are using your 2-day grace period or a prearranged deadline extension, explain that in the readme file.
   d. Explain the motivation for the algorithms that you developed.
   e. You must present an image sequence showing the initial input and the output of each image processing stage applied to the same frame, to prove that your algorithm is working correctly. This is the first place I'll look when grading.
3. Put your completed movies in the doc-files directory. Make sure that they play on the Mac in the Unix lab. Make sure your readme.html file explains what each movie represents. Movies should be compressed down to fewer than 8 MB each.
4. Delete all generated files using "icompile --clean". Do not hand in a build this week.
5. Change to the parent directory of your project and run the FreeBSD command:

   submit371 fx
