Frequently Asked Questions


This is documentation for Classic VPython (VPython 6), which continues to be available but is no longer supported. See vpython.org for information on installing VPython 7 or using GlowScript VPython. Documentation is available at glowscript.org by clicking Help.

Q: Is there a way to run a VPython program in a browser web page?

A: Python itself does not run in a browser, so this is not possible. However, there is a similar 3D programming environment called GlowScript (glowscript.org) which makes it easy to write 3D animations that run in a browser web page. Here is an overview of GlowScript.

Q: Is there a way to generate POV-Ray source files automatically from a VPython program?

A: Yes, the povexport module, available in the Contributed programs section, will do this.
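A minimal sketch of how an export might look (the exact keyword arguments accepted by export() may differ between versions of povexport):

from visual import *      # Classic VPython 6
import povexport          # from the Contributed programs section

sphere(pos=(0, 0, 0), radius=1, color=color.red)
box(pos=(2, 0, 0), size=(0.5, 2, 0.5), color=color.green)

# Write the current contents of the display as a POV-Ray scene description file.
povexport.export(filename='scene.pov')

The resulting .pov file can then be rendered with POV-Ray at whatever resolution you choose.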

Q: Is there a way to capture VPython graphical output as a movie?

A: There is no built-in feature to do this. Jay Wang has posted a detailed description of how to make a movie, along with many interesting examples.

Martin Ligare has posted a simple scheme that can be used on Linux.

On Windows, Taksi is an excellent, easy-to-use freeware video capture tool. It produces avi files which are playable not only on Windows but also on the Mac with QuickTime.

CamStudio is also a good freeware program for capturing to avi format on Windows; at one time the help menu didn't seem to work, but you can get help from the Start menu entry for CamStudio. For capturing VPython animations you probably want to choose the menu option "Region"; in that case, when you start recording, CamStudio waits for you to draw a capture rectangle.

A screen and audio capture tool for Linux is vokoscreen.

A good shareware utility for Windows is Snagit (www.techsmith.com, $40). Let us know of other utilities you have used. Or google "screen capture utilities".

The contributed programs include a program by Kelvin Chu for creating a QuickTime movie on Mac OS X.

From Ruth Chabay: I use a somewhat complex method to make large, high-quality movies from VPython animations; the movies can include sophisticated effects such as shadows, transparency, and refraction. It involves several steps.

1) Import the module "povexport" (available at vpython.org) at the beginning of your program. At intervals, call povexport.export() to export the display as a POV-Ray scene description file. I put a counter in my program and use it to name the exported files sequentially, e.g. "myanimation001.pov", etc. (A sketch of this appears after these steps.)

2) I use POV-Ray, a very sophisticated free raytracer that runs on all platforms, to render all the files. This can be done in batch mode, using POV-Ray's file queue. I set POV-Ray to output Targa files (.tga, raw rgb values), with anti-aliasing turned on. I choose the size of the output files to correspond to the size of the movie I want.

3) Targa files are large, so I convert them to jpg files. I have been using Photoshop, but it would probably be easier to do this with the Python Imaging Library (PIL).

4) To assemble the numbered files into a movie, the simplest and best tool I’ve found (on Windows) is QuickTime Pro, which costs $30.00. I find that a frame rate of 10 frames per second works well for computer-generated movies. (Fewer frames/second is jerky; more frames/second just makes the movie take up more disk space). You may have another favorite tool – I don’t recommend Premiere for this, because its orientation to actual video makes it difficult to produce something at a wildly different frame rate.
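As an illustration of steps 1 and 3, a rough sketch might look like the following. The file names, the frame counter, and the simple animation are only examples, and the keyword arguments accepted by povexport.export() may vary by version:

from visual import *
import povexport
import PIL.Image          # Python Imaging Library, used for step 3

ball = sphere(pos=(-5, 0, 0), radius=0.5, color=color.red)

# Step 1: export a numbered .pov file every few animation steps.
frame = 0
for step in range(300):
    rate(100)
    ball.pos.x += 0.05
    if step % 10 == 0:
        frame += 1
        povexport.export(filename='myanimation%03d.pov' % frame)

# Step 3: after rendering the .pov files to Targa (.tga) with POV-Ray,
# convert each frame to a smaller .jpg.
for n in range(1, frame + 1):
    PIL.Image.open('myanimation%03d.tga' % n).convert('RGB').save('myanimation%03d.jpg' % n)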

That’s it. The “3D Ball and Spring Model of a Solid” movie was produced this way.

Michael Cobb produced a video of a very lengthy VPython computation (see contributed programs) by doing this: "On Windows the Python Imaging Library (PIL) has a grab-image function that I used periodically to capture (screen dump) a pre-defined section of the screen and make a jpg. I then used Gimp to encode all the files (933 of them) together into an avi file. I wish the ImageGrab function worked on Ubuntu, as computations there are about 2-3 times faster than on Windows." This is adequate for his purposes, but of course lower quality than the raytraced images produced using POV-Ray. Details:

from PIL import Image, ImageGrab   # Python Imaging Library; ImageGrab is Windows-only
# ... periodically, during the computation:
im = ImageGrab.grab((0, 0, 600, 600))  # if window located at (0,0), width=height=600
im.save(filename + ".jpg")             # where filename is incremented for each image

Q: Is there a way to create a stand-alone VPython application?

A: From Andrei Makhanov, for Windows:

1) To compile a VPython program to an exe, you need to download py2exe and install it.
    A version compatible with Python 2.7.3 exists.

2) py2exe requires msvcp90.dll to be installed, which is distributed by Microsoft.
    Find the .dll online and put it into C:\Python27\DLLs.

3) Edit C:\Python27\Lib\site-packages\visual_common\materials.py with VIDLE.

4) In materials.py, find texturepath. Find where it says "visual\\" and change it to "".

5) Create a file called setup.py containing the following (change the name "stars.py" to the name of your own .py file):

from distutils.core import setup
import py2exe

setup(
    console = ["stars.py"],
)

Alternatively, you can use a more complete setup.py that bundles the texture files, in which case you don't need to perform step 8 below; a sketch of such a file appears after these steps. You will need to change the "script" option to replace the name "stars.py" with the name of your own file.

6) Place the files setup.py and stars.py (or your file name) in a folder that's easy to get to.

7) In a command prompt, go to this folder and execute the following command:
    C:\Python27\python setup.py py2exe

8) From C:\Python27\Lib\site-packages\visual_common copy the following files to the folder containing the generated exe (the dist folder that py2exe creates):
    turbulence3.tga, wood.tga, BlueMarble.tga, brickbump.tga, earth.tga, and random.tga.

9) The program should now execute and work.
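For reference, a more complete setup.py that also bundles the texture files from step 8 might look something like this. The installation path and the use of data_files below are assumptions based on a default C:\Python27 installation; adjust them for your machine:

import os
from distutils.core import setup
import py2exe

# Assumed default location of the VPython texture files; adjust if needed.
visual_common = r"C:\Python27\Lib\site-packages\visual_common"
textures = ["turbulence3.tga", "wood.tga", "BlueMarble.tga",
            "brickbump.tga", "earth.tga", "random.tga"]

setup(
    console = ["stars.py"],   # replace stars.py with the name of your own program
    # Copy the textures next to the generated exe so step 8 is unnecessary.
    data_files = [("", [os.path.join(visual_common, t) for t in textures])],
)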

Q: What stereo glasses should I buy to use with scene.stereo?

A: From Bruce Sherwood: The cheapest reasonably good scheme is red-cyan glasses, with the red lens on the left eye (scene.stereo = 'redcyan'). Google "red-cyan stereo glasses" for options; cost is less than a dollar each. For occasional use I find the handheld variety to be preferable to ones with earpieces, because the flat handheld glasses are easier to store, hand out, and retrieve. Red-cyan is far preferable to the older red-blue (scene.stereo = 'redblue'), because red-blue scenes are essentially monochrome (red or blue), with ugly magenta in overlap regions, whereas red-cyan permits full color (albeit pastel), and overlap regions are white.

Red-cyan glasses can be used with any computer, including laptops or computer projectors. No special graphics card is required. Colors are not true due to the necessity of adding some white to pure colors in order to get stereo. For example, a pure red sphere would provide no image for the right (cyan) eye, so some white is added to the red to make a pink. The effect is that all colors are pastel. Another disadvantage of red-cyan glasses is that there is some bleed-through of the red or left image through the cyan filter to the right eye, and some bleed-through of the cyan or right image through the red filter to the left eye. This is probably unavoidable, because not only are the cheap filters not perfect, but the standard red-green-blue emitters used in displays are not pure red, green, and blue but contain some colors in the other regions of the spectrum. Nevertheless, the stereo effect with red-cyan glasses is quite striking, no special graphics equipment is needed, and the price is right.

Another option is to use side-by-side stereo (scene.stereo = 'passive') with relatively inexpensive stereoscopic viewers such as those available at http://www.berezin.com/3d/viewers1.htm. As with red-cyan stereo, no special computer equipment is required.

Some people are able to train themselves to see small stereo scenes with no glasses (scene.stereo = 'crosseyed'). Put your finger between your eyes and the screen and focus on the finger. Move the finger toward or away from you until the two screen images merge. Then, without changing the directions your eyes are pointing, change the focus to the screen, and you'll see a full stereo view. Similarly, if you're able to look "walleyed" (eyes pointing nearly parallel, to the far distance, but focused on the screen), you can see stereo for small scenes using scene.stereo = 'passive'.

With all of these schemes, the effect is enhanced by rotating the scene as you view it.
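Turning on any of these modes in a program is just a matter of setting scene.stereo; a minimal sketch (the objects shown are arbitrary examples):

from visual import *    # Classic VPython 6

scene.stereo = 'redcyan'    # or 'redblue', 'passive', 'crosseyed', 'active'
sphere(pos=(0, 0, 0), radius=0.5, color=color.white)
box(pos=(1.5, 0, 1), size=(0.5, 0.5, 0.5), color=color.yellow)

# Drag with the right mouse button to rotate the scene while viewing;
# rotation enhances the stereo effect.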

Quad-buffered stereo

Due to a bug, quad-buffered stereo did not work with VPython 5.x versions prior to 5.41.

A good option for showing high-quality stereo (scene.stereo = 'passive') to large groups is to buy an appropriate "quad-buffered" graphics card, such as the NVIDIA Quadro series, that can present the left and right views to two side-by-side (or over-and-under) computer projectors, each with a polarizer, projecting onto a special non-depolarizing (metallic) screen. The audience wears polarizing glasses (again, you can find these from the same sources identified through Google, and these glasses are only about 50 cents each). Ordinary screens don't work, because the polarization is destroyed on reflection. For a lot of detail on this option, google "Geowall", a consortium of people using this approach in geography research and education. Polarization can be either linear (horizontal and vertical) or circular (left and right circular polarization); you need different polarizers and different glasses for the two schemes. A minor disadvantage of the linear polarization scheme is that the stereo effect is more easily disturbed when you tip your head.

What is "quad-buffered"? A standard graphics card is "double-buffered": it holds an image in one buffer and continually hands it to the display to refresh the screen. At the same time in a second buffer the card can be accepting from the computer the creation of a new image. Upon completion of drawing the new image (in the case of VPython, using OpenGL to create that new image), the card switches to refreshing the screen from the second buffer. A quad-buffered graphics card has two double buffers ("quad"), one for the left image and one for the right. It can give two computer projectors left and right images.

With a quad-buffered graphics card, "shutter glasses", and a fast 120 Hz display, you can achieve very high-quality stereo on a computer (scene.stereo = 'active'). Shutter glasses alternate opaque and transparent states of the lenses in front of the left and right eyes, so that at any instant each eye sees only the view appropriate to it. To avoid flicker, ideally the display should run at 100 Hz or more (50 or more images for the left eye per second), which is why this scheme isn't good with most displays, which run at only 60 Hz. The graphics card must also furnish (as quad-buffered cards normally do) a synchronization signal to the shutter glasses to switch views. This can be infrared (wireless shutter glasses) or wired; google "shutter glasses".

Important: The NVIDIA "3D Vision" GeForce graphics cards do NOT currently support OpenGL stereo (they only support Microsoft DirectX stereo applications). Only the NVIDIA Quadro cards can be used for VPython active stereo.