Q: Is there a way to generate POV-Ray source files automatically from a VPython program?
A: Yes, the povexport module in the Contributed programs will do this.
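A minimal sketch, assuming classic VPython (the visual module) and that povexport.export() accepts an optional filename argument, as Ruth Chabay's description below implies:
from visual import *
import povexport                  # from the Contributed programs at vpython.org
sphere(color=color.red)           # build any VPython scene
povexport.export(filename='scene.pov')   # write a POV-Ray scene description file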
Q: Is there a way to capture VPython graphical output as a movie?
A: There is no built-in feature to do this, but there are several workarounds. Among the contributed programs is a program by Kelvin Chu for creating a QuickTime movie on Mac OS X.
CamStudio is a good freeware program for capturing to AVI format on Windows (download at http://www.freedownloadscenter.com/Reviews/r1075.html). At one time its help menu didn't seem to work, but you can get help from the Start menu entry for CamStudio. For capturing VPython animations you probably want the menu option "Region", in which case CamStudio waits for you to draw a capture rectangle when you start recording.
A good shareware utility for Windows is Snagit (www.techsmith.com, $40). Let us know of other utilities you have used, or Google "screen capture utilities".
From Ruth Chabay: I use a somewhat complex method to make large, high-quality movies from VPython animations; these movies can include sophisticated effects such as shadows, transparency, and refraction. The method involves several steps.
1) Import the module "povexport" at the beginning of your program (available at vpython.org). At intervals, call povexport.export() to export the display as a POV-Ray scene description file. I put a counter in my program and use it to name the exported files sequentially, e.g. "myanimation001.pov", etc. (see the sketch at the end of this answer).
2) I use POV-Ray, a very sophisticated free raytracer that runs on all platforms, to render all the files. This can be done in batch mode, using POV-Ray's render queue. I set POV-Ray to output Targa files (.tga, raw RGB values) with anti-aliasing turned on. I choose the size of the output files to correspond to the size of the movie I want.
3) Targa files are large, so I convert them to JPEG files. I have been using Photoshop, but it would probably be easier to do this with the Python Imaging Library (PIL), as in the sketch at the end of this answer.
4) To assemble the numbered files into a movie, the simplest and best tool I've found (on Windows) is QuickTime Pro, which costs $30.00. I find that a frame rate of 10 frames per second works well for computer-generated movies. (Fewer frames per second looks jerky; more frames per second just makes the movie take up more disk space.) You may have another favorite tool; I don't recommend Premiere for this, because its orientation toward actual video makes it difficult to produce something at a wildly different frame rate.
That’s it. The “3D Ball and Spring Model of a Solid” movie on the Matter & Interactions website was produced this way.
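As a concrete illustration of steps 1 and 3, here is a minimal sketch. It assumes classic VPython, that povexport.export() accepts a filename argument as described above, and that PIL can read the Targa files POV-Ray produces:
from visual import *
import povexport
ball = sphere(color=color.red)
counter = 0
while counter < 100:
    rate(10)
    ball.pos.x += 0.01                                 # advance the animation
    povexport.export(filename='myanimation%03d.pov' % counter)
    counter += 1

# Later, after POV-Ray has rendered the .pov files to .tga files:
import Image                                           # from PIL
for n in range(100):
    im = Image.open('myanimation%03d.tga' % n)         # PIL reads Targa files
    im.convert('RGB').save('myanimation%03d.jpg' % n)  # JPEGs ready for assembly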
Michael Cobb produced a video of a very lengthy VPython computation (see contributed programs) by doing this: "On Windows the Python Imaging Library (PIL) has a grab-image function that I used periodically to capture (screen dump) a pre-defined section of the screen and make a jpg. I then used Gimp to encode all the files (933 of them) together into an avi file. I wish the ImageGrab function worked on Ubuntu, as computations are about 2-3 times faster than on Windows." This is adequate for his purposes, but of course lower quality than the raytraced images produced using POV-Ray. Details:
import ImageGrab                        # from PIL; ImageGrab is Windows-only
n = 0
# ... at intervals during the computation:
im = ImageGrab.grab((0, 0, 600, 600))   # if window located at (0,0), width = height = 600
im.save('capture%03d.jpg' % n)          # numbered file name, e.g. capture000.jpg
n += 1
Q: Is there a way to create a stand-alone VPython application?
A: From David Andersen: Using py2exe (http://sourceforge.net/projects/py2exe/) I was able to build and run "stars.exe". I used the following "setup.py" file and ran "python setup.py py2exe". The build process gave a warning about a missing "dotblas" (part of the Numeric package, I believe), but the resulting executable ran with no problem. I also built "stonehenge.exe"; it also worked.
--- file setup.py
from distutils.core import setup
import py2exe                  # registers the py2exe command with distutils
setup(console=["stars.py"])    # build a console executable from stars.py
What py2exe does is pull together the Python modules and dynamic link libraries used by a Python program, put them all in a "dist" directory, and build a stub executable (which I believe contains the Python source for the target program). A listing of this directory for "stars.py" follows. The "library.zip" file contains the library Python modules used by the program.
Directory of C:\dist
12/10/2003 11:56 PM <DIR> .
12/10/2003 11:56 PM <DIR> ..
10/02/2003 08:03 PM 57,400 _sre.pyd
10/05/2003 11:48 PM 757,816 cvisual.dll
05/16/2003 04:54 PM 61,440 umath.pyd
05/16/2003 04:54 PM 69,632 _numpy.pyd
05/16/2003 04:54 PM 36,864 multiarray.pyd
12/10/2003 11:56 PM 679,991 library.zip
10/02/2003 08:02 PM 974,908 python23.dll
12/12/2002 12:14 AM 257,536 DDRAW.dll
08/23/2001 12:00 PM 116,736 GLU32.dll
08/29/2002 05:41 AM 686,080 OPENGL32.dll
12/10/2003 11:56 PM 24,576 stars.exe
11 File(s) 3,722,979 bytes
Q: What stereo glasses should I buy to use with scene.stereo?
A: From Bruce Sherwood: The cheapest reasonably good scheme is red-cyan glasses, with the red lens on the left eye (scene.stereo = 'redcyan'). Google "red-cyan stereo glasses" for options; the cost is about 50 cents each. For occasional use I find the handheld variety preferable to ones with earpieces, because the flat handheld glasses are easier to store, hand out, and retrieve. Red-cyan is far preferable to the older red-blue (scene.stereo = 'redblue'), because red-blue scenes are essentially monochrome (red or blue), with ugly magenta in overlap regions, whereas red-cyan permits full color (albeit pastel), and overlap regions are white.
Red-cyan glasses can be used with any computer, including laptops or computer projectors. No special graphics card is required. Colors are not true, due to the necessity of adding some white to pure colors in order to get stereo. For example, a pure red sphere would provide no image for the right (cyan) eye, so some white is added to the red to make a pink. The effect is that all colors are pastel. Another disadvantage of red-cyan glasses is that there is some bleed-through of the red (left) image through the cyan filter to the right eye, and of the cyan (right) image through the red filter to the left eye. This is probably unavoidable, because not only are the cheap filters imperfect, but the standard red-green-blue emitters used in displays are not pure red, green, and blue; each contains some light from other regions of the spectrum. Nevertheless, the stereo effect with red-cyan glasses is quite striking, no special graphics equipment is needed, and the price is right.
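Choosing a stereo mode in a program is a single assignment; a minimal sketch, assuming classic VPython (the visual module):
from visual import *
scene.stereo = 'redcyan'    # red lens on the left eye; the other modes discussed
                            # in this answer are 'redblue', 'passive', 'active',
                            # and 'crosseyed'
sphere(pos=(-1,0,0), radius=0.5, color=color.orange)
box(pos=(1,0,0), size=(1,1,1), color=color.green)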
A good option for showing high-quality stereo (scene.stereo = 'passive') to large groups is to buy an appropriate "quad-buffered" graphics card that can present the left and right views to two side-by-side (or over-and-under) computer projectors, each with a polarizer, projecting onto a special non-depolarizing (metallic) screen. The audience wears polarizing glasses (again about 50 cents each, from the same sources found through Google). Ordinary screens don't work, because the polarization is destroyed on reflection. For a lot of detail on this option, Google "Geowall", a consortium of people using this approach in geography research and education. Polarization can be either linear (horizontal and vertical) or circular (left and right circular polarization); you need different polarizers and different glasses for the two schemes. A minor disadvantage of the linear polarization scheme is that the stereo effect is more easily disturbed when you tip your head.
What is "quad-buffered"? A standard graphics card is "double-buffered": it holds an image in one buffer and continually hands it to the display to refresh the screen. At the same time in a second buffer the card can be accepting from the computer the creation of a new image. Upon completion of drawing the new image (in the case of VPython, using OpenGL to create that new image), the card switches to refreshing the screen from the second buffer. A quad-buffered graphics card has two double buffers ("quad"), one for the left image and one for the right. It can give two computer projectors left and right images.
With a quad-buffered graphics card, "shutter glasses", and a CRT rather than a flat-panel display, you can achieve very high-quality stereo on a computer (scene.stereo = 'active'). Shutter glasses alternate opaque and transparent states of the lenses in front of the left and right eyes, so that at any instant each eye sees only the view appropriate to it. To avoid flicker, the display should ideally run at 100 Hz or more (50 or more images per eye per second), which is why this scheme isn't good with flat-panel displays that run at only 60 Hz, though I get rather good results on a 75 Hz monitor. The graphics card must also furnish (as quad-buffered cards normally do) a synchronization signal that tells the shutter glasses when to switch views. This can be infrared (wireless shutter glasses) or wired; Google "shutter glasses".
Some people are able to train themselves to see small stereo scenes with no glasses (scene.stereo = 'crosseyed'). Put your finger between your eyes and the screen and focus on the finger. Move the finger toward or away from you until the two screen images merge. Then, without changing the directions your eyes are pointing, change your focus to the screen, and you'll see a full stereo view. Similarly, if you're able to look "walleyed" (eyes pointing nearly parallel, as if to the far distance, but focused on the screen), you can see stereo for small scenes using scene.stereo = 'passive'.
With all of these schemes, the effect is enhanced by rotating the scene as you view it.