

In high school, we had a computer room full of Teletype terminals with 300-baud dial-up lines to the University of Delaware Project Delta -- a DEC pdp11/45 running the RSTS real-time/time-sharing operating system. We wrote, ran and debugged programs interactively in BASIC. We had Kenneth Knowlton ASCII-Art pin-up calendars on the walls. We generated buckets of chad with the paper-tape punch. I did my science project on Electronic Music. I still have blueprints of old System-3 modules that engineers at Moog Music sent me.

In 1974, programming at Lehigh University meant FORTRAN punch cards, an IBM batch system and pdp8 Assembly Language -- "It's your PAL." It was like going back to the stone age. The Electronic Music Studio had a Buchla, which was really weird to play. A guy in my dorm had a Mini-Moog, which was much more intuitive to me. The Electrical Engineering labs had huge installations of dusty old power-conversion equipment -- motor/dynamos -- which had obviously been unused for years.

I dropped out of Lehigh in 1976 and worked at Rollins Broadcasting Station WAMS, 1380 KHz AM, 5,000 Watts. I obtained my FCC 3rd-Class Operator's License and studied for the 1st-Class Engineer's License.

In 1977, I started at the University of Delaware. While I was re-taking Calculus and Differential Equations, I studied Art & Technology -- a la Billy Klüver/Robert Rauschenberg's E.A.T. of the 1960s -- at the Library. I had a "progressive" music program on the College radio station, WXDR, 93.1 FM, 1,000 watts.

I took two required Electrical Engineering courses per semester and filled the rest with Art History, Drawing, Visual Design, Sculpture, Electronic Music, Psychology of Perception, Theater Technology, Physical Optics and Physical Acoustics. It was a generalized education in Art & Technology. I have had Color Theory from every possible perspective, including video engineering and computer graphics. I studied Electronic Music with Fred Hofstetter, Director of the Plato Project at U-Del; I studied Sculpture with Joe Moss, Acoustical Sculptor and former Fellow of the MIT Center for Advanced Visual Studies (C.A.V.S.).

The main product of my undergraduate career was United States Patent 4,248,120, "Stringed Musical Instrument with Electrical Feedback", and an article on the self-same Electroacoustic Monochord, published in "Perspectives of New Music", 1981-1982. I bought an old upright piano, disassembled it and did some work on creating a mechanically reorganized "Infinitely-Preparable" version. I designed and built some prototype circuits and mechanical assemblies for a polyphonic keyboard that measured inertia from piano hammers. It kept the feel of a piano without any electromechanical fiddling.

By the time I graduated with my Bachelor's in Electrical Engineering, my professors were telling me that the TI DSP chip could synthesize a symphony orchestra in real time, so music was no longer a hardware problem.

My first job was at AT&T Bell Laboratories, Naperville, IL. I spent a lot of time in the Bell Labs Technical Library. I researched my grandfather, who was a radio pioneer, audio engineer and supervisor of the sound stage in Brooklyn, NY where Vitaphone and later Western Electric sound-on-picture recording was developed. I found a copy of the original Carl Sagan Report to Congress on the "Possibility of Life Elsewhere in the Universe" -- the document that launched the Search for Extraterrestrial Intelligence (S.E.T.I.). I researched digital holography, a la the MIT Media Lab of the time.

I made 3D Books on the laser printers and photocopiers. The software was Brian Kernighan's 'pic' and 'troff' for Unix. I did computer graphics and optical film experiments on the generally-available graphics output devices: the 'STARE' electrostatic plotter, the CalComp flat-bed pen plotter and the Information International, Inc. (III) FR80/Comp80 vector film printer. The FR80 at Naperville shot high-contrast black-and-white positive unperforated 35mm film. I made vector art by superimposing multiple exposures onto Ektachrome using differently colored Wratten Gels.

The FR80 had a huge addressable range (could make really smooth circles) and could resolve 33 lines per millimeter. I made diffraction gratings, Fresnel zone plates, simple, primitive digital holograms and I took optical measurements to verify the results.
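The zone-plate geometry is simple enough to sketch today. Here is a minimal Python illustration (the originals were FORTRAN, and the numbers below are hypothetical -- a HeNe-ish wavelength and an arbitrary focal length): the n-th zone boundary falls at r_n = sqrt(n·λ·f), and the spacing of the outermost zones is what a plotter's 33 line-per-millimeter resolving power ultimately limits.

```python
import math

def zone_radii(focal_mm, wavelength_mm, n_zones):
    """Fresnel zone-plate boundary radii: r_n = sqrt(n * wavelength * f)."""
    return [math.sqrt(n * wavelength_mm * focal_mm) for n in range(1, n_zones + 1)]

# Hypothetical parameters: 633 nm (HeNe) light, 100 mm focal length, 16 zones.
r = zone_radii(100.0, 633e-6, 16)
outer_pitch_mm = r[-1] - r[-2]       # finest zone spacing, out at the rim
lines_per_mm = 1.0 / outer_pitch_mm  # ~31 here, just inside the FR80's ~33 limit
```

Adding more zones shrinks the outer pitch as 1/(2·sqrt(n)), so the film recorder's resolving power caps how many zones, and hence how short a focal length, you can plot.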

The FR80 at Bell Labs in Holmdel, NJ had a 16mm cine camera head. All the graphics software for the common output devices at the time was in FORTRAN, but I edited and managed the files in Unix on a DEC Vax 11/780. There was a "virtual card reader" interface between the Vax and the IBM 370 batch system. I could submit computing jobs from my desk in Naperville and they would be transmitted over AT&T's private satellite data network to run in Holmdel. I got 16mm film back in my mailbox several days later.

I kept in correspondence with Otto Piene, the Director of the MIT C.A.V.S. I had applied to the Science Masters in Visual Studies Program in 1980. Otto told me, "We only accept rich kids, who can afford it." But by 1983 or so, he had put me in touch with Joe Davis, a Fellow at the C.A.V.S. at the time. Joe was working on a lightning-activated sky-art project involving a tall aluminum tower to be placed on a "spoils island" in the Mississippi Delta. I designed a lightning-pumped laser to be installed on the tower. I created large-format drawings of the laser using the CalComp pen plotter.

I experimented with UUCP and UUX on the Bell Labs/Naperville Unix network. There was no system security in those days, so I could snoop around at will.

In 1984, Joe Davis invited me to visit the MIT C.A.V.S. and lecture for his class. The trip coincided with a conference on the study of Polyhedra at Smith College. I met Tom Banchoff at that conference. Later that year, I left AT&T to work for Mort and Millie Goldsholl and their son, Harry, in Northfield, Illinois. On the advice of Guenther Tetz of Chicago, the Goldsholls had purchased one of Tom DeFanti's original GRASS "Circle Graphics Habitat" systems. They needed someone to program it.

This original GRASS system was composed of a Digital Equipment Corporation (DEC) pdp11/40 with 32 kilobytes of real, magnetic core memory and a 22-inch round Vector-General 'scope attached to it. This was the same type of system that Larry Cuba used to do the "Death Star" hologram fly-through sequence for the original "Star Wars". The GRASS software was written in pdp11 Assembly Language (PAL). The system came with a printout of the source code. It still had Tom's Jim Morrison quotes in the comments.

Harry and I rigged up an animation camera controller to the pdp11's parallel port. The camera was a vintage 1911 Bell & Howell pin-registered 35mm cine camera with two-digit serial number and a hole in the side with a square shaft in it, where the hand-crank used to go. The 1,000-foot film magazine was the old "mouse-ears" type which sat on top of the camera. I learned how to load, shoot and develop my own animation pencil tests on the system. The GRASS software was an interactive interpreter which had a FORTRAN-like syntax. The Vector-General had a dial-and-button box which you could hook up to your program in software to translate, tumble and otherwise manipulate your graphics interactively. This was my first interactive computer graphics system.

You could hear the memory cores rattling on their wires as the system looped through its display list drawing your graphics. Tom had added some pretty freaky string-processing functions to GRASS that FORTRAN never learned about. I could write programs in GRASS that would draw "fractal trees" from Lindenmayer (L-System) strings. After our first on-line movable-head disk head-crash, we bought a removable disk cartridge drive to do backups to.
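Those L-System strings are easy to reproduce today. Here is a minimal Python sketch of the string-rewriting step (the bracketed rule shown is a textbook example, not necessarily the one I used in GRASS): each pass replaces every symbol with its production, and the brackets mark the branch push/pop for the drawing pass.

```python
def expand(axiom, rules, iterations):
    """Iteratively rewrite the string: classic Lindenmayer-system expansion."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(c, c) for c in s)
    return s

# A textbook bracketed rule for a fractal "tree"; [ and ] push/pop the turtle state.
rules = {"F": "F[+F]F[-F]F"}
print(expand("F", rules, 1))   # F[+F]F[-F]F
```

A turtle-graphics pass then walks the final string, drawing a segment for each F, turning at + and -, and branching at the brackets, which is what produced the trees on the Vector-General.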

In 1985, I attended my first ACM/SIGGRAPH conference. I discovered the reality of what Tom DeFanti, Larry Smarr, Donna Cox, et al. would later tout as the arguments in favor of visual computing: visualization is a wide information pipe, particularly in the hands of someone like Jim Blinn or Nelson Max. In the days when SIGGRAPH represented all of the research in computer graphics and visualization there was, I left the Electronic Theater (Film & Video Show) feeling that I had seen the frontier of human knowledge and that it had been made accessible to me.

Harry Goldsholl and I met with Bill Kovacs, Mark Sylvester and Roy Hall of Wavefront Technologies, Inc. We compared Wavefront with Alias and went with Wavefront. We became installed, licensed site #2. In October of that year, Harry and I went up to Santa Barbara to train with Bill, Mark, Roy and Richard ("R.T.") Taylor on the Wavefront system. We learned it from the people who wrote it. Harry donated the GRASS system to Tom DeFanti's Electronic Visualization Lab (EVL) at the University of Illinois at Chicago in exchange for render time on Ralph Orlick's VAX 11/780.

Wavefront ran on the Silicon Graphics Integrated Raster Imaging System "I.R.I.S." The first one we bought was based on the Motorola MC68010 processor-on-a-chip and the Silicon Graphics Geometry Engine. It had around 8 megabytes of RAM, maybe 20 megabytes of disk, and ran an adaptation of AT&T Unix System-V. The SGI Geometry Engine had a broadcast-quality, genlocked component video output, so our paradigm changed from film output to video output. We bought a Lyon-Lamb single-frame video recording controller which worked, with modification, with a Sony 3/4" tape deck (animation tests) and a Hitachi 1" open-reel video tape recorder, which we rented for final output.

The renderer in the first releases of Wavefront had no texture mapping or transparency mapping, so I wrote C programs that procedurally generated objects which simulated texture and transparency mapping using polygons with different shader assignments.
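A sketch of the transparency trick in Python (the original programs were C, and these helper names are mine): tile the surface with small quads and simply omit, or assign a different shader to, the cells the map would have made transparent -- a screen-door version of transparency built from geometry alone.

```python
def checker_quads(n, size=1.0):
    """Quads for the 'opaque' cells of an n x n checkerboard; the remaining
    cells are omitted, simulating a 50% transparency map with polygons."""
    d = size / n
    quads = []
    for i in range(n):
        for j in range(n):
            if (i + j) % 2 == 0:        # keep only alternating cells
                x, y = i * d, j * d
                quads.append([(x, y), (x + d, y), (x + d, y + d), (x, y + d)])
    return quads

opaque = checker_quads(8)   # 32 of the 64 cells survive
```

Texture mapping got the same treatment: subdivide the surface into many small polygons and assign each one a shader whose color samples the would-be texture at that cell.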

Before too long, Harry bought a second and third SGI Iris system. I built the Local-Area Network (LAN) out of 50-ohm coaxial "Thick-Net", which had "vampire" transceivers that tapped into the center conductor through a 1/4" hole "cored" through the shield side of the cable. The vampire tap clamped around the cable with a bolt tightened with a 9/16-inch wrench. We had a 300/1200-baud modem for dialing up to the VAX at the EVL to set up render jobs. I collected rendered sequences on 1/2-inch, 9-track data tape by traveling back and forth between Northfield and downtown Chicago.

I did e-mail via a dial-up connection to Wavefront in Santa Barbara. I also connected to Keith Goldfarb at the newly-founded Rhythm & Hues studio.

There were still occasions which called for film output, so I worked up an RS232 serial interface to the Interactive Motion Control (I.M.C.) robotic motion-control camera rig, of which Goldsholl Design and Films owned at least two. On some salesman's advice, Harry bought a Control Data Corp. (CDC) external SCSI disk drive which turned out not to be supported on the SGI Iris 2400. I spent several months (it seemed) attempting to label, partition and format that disk -- in consultation with the main device-driver guy from SGI. We backed that disk up onto a Kennedy 1/2-inch 9-track tape drive which held around 100 Megabytes on a 10-inch open reel.

I took the Mensa membership exam in 1986.   I passed with a 147.

In 1987, the bottom fell out of the Computer Graphics and Animation industry. Editel, Chicago closed. It looked bad. I called up Mark Sylvester from Wavefront and he told me that The Post Group in Hollywood was hiring. Joe Gareri and Linda Rheinstein hired me to replace Evan Ricks, who was leaving to found ReZ.n8 Productions with Paul Sidlo, who had dropped out of Cranston-Csuri Productions in Columbus, Ohio. Before I left Goldsholl Design and Film, Mort had given me a copy of an article on Stereolithography and Tom Banchoff had pointed me to David Hoffman's article in the Mathematical Intelligencer on their new embedded minimal surfaces.

The Post Group was installed, licensed site number 1 from Wavefront Technologies. The Post Group was in the process of transitioning to a 4-2-2 component digital (D-1) editing facility, which was built while I worked there. In the Graphics Department we laid off animation frames to the Abekas A60 D1 Digital (parallel-RAID) Disk Recorder over an Ethernet interface. Maury Rosenfeld was the programmer of the Quantel Mirage at The Post Group before leaving to found Planet Blue. I developed an interface between the Cyberware Laboratory laser range-finding 3D scanning digitizer and the Quantel Mirage.

In 1989, I started working part-time as Unix system and network administrator for the Computer Lab of the Cal Arts Film School.

I contacted engineers at 3D Systems in Valencia, CA and began developing software to convert Wavefront OBJ files to the 'STL' files used for 3D-printing data exchange. I contacted Jim Hoffman at the GANG Lab, U-Mass, Amherst, who sent me a bunch of STL files containing Costa-Hoffman-Meeks minimal surfaces. I developed software to create 'thickened' surfaces, symmetrically displaced in the normal direction from the original, infinitesimally-thin, single-sided surfaces. I essentially re-invented Andrew Glassner's "Winged-Edge" models from Graphics Gems II (1991).
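The thickening step itself is straightforward to sketch. A minimal Python version (illustrative names are mine, and the real tool also had to stitch the rim and keep winding order consistent): offset every triangle by ±t/2 along its unit normal, producing two parallel sheets from one infinitesimally thin surface.

```python
import math

def tri_normal(a, b, c):
    """Unit normal of triangle (a, b, c) via the cross product of two edges."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    m = math.sqrt(sum(x * x for x in n))
    return [x / m for x in n]

def thicken(tri, t):
    """Return (outer, inner) copies of a triangle, displaced +/- t/2 along its normal."""
    n = tri_normal(*tri)
    outer = [[p[i] + n[i] * t / 2 for i in range(3)] for p in tri]
    inner = [[p[i] - n[i] * t / 2 for i in range(3)] for p in tri]
    return outer, inner

tri = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]   # lies in the z=0 plane, normal +z
outer, inner = thicken(tri, 0.1)          # sheets at z = +0.05 and z = -0.05
```

For a shared mesh you would average the normals of the triangles around each vertex and displace the vertex once, so the two sheets stay watertight; the rim between them then gets closed with a strip of quads.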

There was a guy in Oxnard who made color holograms. I developed methods of rendering scenes in Wavefront for output as a color hologram. Bruno George was supervising effects for "Max Headroom" after having been on the optical crew for "Ghostbusters". Bill Villarreal also worked in The Post Group Graphics department. Bill had written the Cray interface to the laser printer at Robert Abel Associates. In the depths of the writers' strike we began work on a digital video-to-film transfer process, which came to be known as the Gemini Process, United States Patent 5,191,416, 1989.

The Gemini Process was a collaboration between The Post Group and Pacific Title and Art Studio -- a Hollywood institution dating back to 1912. The optical printer lathe bed of this system had on it a 1911-vintage Bell & Howell 35mm stop-motion animation camera with a two-digit serial number, same as the one I ran at Goldsholl's.

So, I have effectively been an optical printer camera operator since 1989, although I didn't join IATSE until 1996.

The Gemini Process lathe bed also had on it a Tektronix monochrome cathode-ray tube with an optically flat screen and a high-resolution phosphor coating on it.  It was driven at 1024-scanline resolution by the 10-bit DAC video controller of a Pixar Image Computer. (PIC)  The PIC had a high-speed interface on the backplane of a Sun-3 workstation.  The Sun had an Abekas A-60 CCIR601 4-2-2 hardware-RAID digital magnetic disk recorder on a parallel-SCSI interface.  Simon Carter at Abekas helped me with the 12-bit fixed-point arithmetic necessary to convert the 16-bits per pixel 4-2-2 YUV images coming off the A60's SCSI interface to 36-bits per pixel 4-4-4 RGB images in Pixar image memory.  Loren Carpenter at Pixar helped me parallelize the code in Channel-Processor (ChaP) Assembly language (ChAS) to process 16 pixels at a time in the PIC four-way parallel bit-slice processor.
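The arithmetic Simon helped me with can be sketched in a few lines. This is Python rather than ChaP assembly, and it uses the standard CCIR-601 studio-range coefficients scaled into 12-bit fixed point (multiply by 4096, add half for rounding, shift back down) with the result clamped to 8 bits for simplicity -- the shape of the trick, if not the literal code.

```python
def ycbcr_to_rgb(y, cb, cr):
    """CCIR-601 studio-range Y'CbCr (8-bit) to R'G'B' in 12-bit fixed point.
    Coefficients are round(k * 4096): 1.164 -> 4769, 1.596 -> 6537, etc."""
    yy = y - 16          # remove the 16..235 luma pedestal
    cb -= 128            # center the chroma channels
    cr -= 128
    def fix(v):          # >>12 with rounding, then clamp to 8 bits
        return max(0, min(255, (v + 2048) >> 12))
    r = fix(4769 * yy + 6537 * cr)
    g = fix(4769 * yy - 3330 * cr - 1602 * cb)
    b = fix(4769 * yy + 8262 * cb)
    return r, g, b

print(ycbcr_to_rgb(235, 128, 128))   # (255, 255, 255): studio white
print(ycbcr_to_rgb(16, 128, 128))    # (0, 0, 0): studio black
```

On the PIC, the same multiplies and shifts ran in parallel across the four bit-slice channel processors, 16 pixels at a time, with the chroma samples first doubled up from 4-2-2 to 4-4-4.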

Bill Villarreal designed and built a camera controller on an RS232 serial interface from the Sun.  The controller cleanly switched the R, G, B video component signals from the Pixar's video out to the Tektronix monitor and advanced the camera.   This happened in parallel, so that the system could image-process a frame of video at almost the rate it took to expose the color separations.   The system exposed monochrome color separations on high-resolution black-and-white positive film.  Since the inter-positive is monochrome, like Technicolor, it is archival and will not fade with time.  The color separations were then re-composited onto color negative with color filters via optical printing at Pacific Title and Art Studio.

Under the direction of Bob ???? I ran gamma wedges through the system, using a film densitometer to measure the results.  The end result was a one-light printing process for digital effects, produced in video-interactive time rather than film-effects time.   You could do zeroth-generation effects in the D1 editing suite on original film plates (via digital telecine), and the prints through the Gemini Process would color-match the original film and could be inter-cut into it.  Digital effects for "Ghost" (1990); "Dozo: Don't Touch Me" (Jeff Kleiser/Diana Walczak, SIGGRAPH 1990); "Freejack" (1992); "The Lawnmower Man" (1992) and "Warlock II: The Armageddon" (1993) went through the Gemini Process.

On "Lawnmower Man", in addition to digital film printing, I had been doing digital sculpture with the help of Brian Vandellyn Park in the Marketing Department of DTM Corporation in Austin, TX. Brian Park was the inventor of the Flogiston Chair, which was used in the scenes of The Lawnmower Man where "Jobe" (Jeff Fahey) is learning at an accelerated rate in Virtual Reality.

Brian put set dresser Jacqui Masson in touch with me. The production company, BenJade Films, rented four of my sculptures to use in the set for "Sebastian Timms' office".

The telekinetically flying dagger in the final scene of Warlock II: The Armageddon came from a Cyberware scan. The digital visual effects for that scene were edited in CCIR-601 (D-1) 4-2-2 digital video, mixed in the Abekas A-84 D-1 digital switcher and the dagger element (and several other objects) were animated and rendered in the Sony System G.  These objects went through my interface between the Cyberware scanner, the Wavefront Advanced Visualizer and the Sony System-G.

One of my co-workers at The Post Group also worked part-time evenings and weekends at the Bodhi Tree on Melrose Avenue in West Hollywood.

In 1990, I borrowed a Personal Iris 4D/20G from the Los Angeles Silicon Graphics sales office and a demo license of Wavefront T.A.V. from Santa Barbara, and began rendering at home a proposal for an exhibition of mathematical sculpture via rapid prototyping.  Linda Rheinstein allowed me to print my digital images on the Iris Graphics ink-jet printers at Electric Paint, The Post Group's print graphics division.  I sent this proposal to Tom DeFanti, chairman of the 1990 SIGGRAPH Special Projects committee, who funded my first build of stereolithographs at the Hughes Opto-Electronics and Data Systems Division in El Segundo on an SLA-500 with a nearly 20-inch-cube build envelope.  I borrowed a cameraman from The Post Group and shot a Betacam tape of a set of four of the Fermat surfaces emerging from the SLA-500's resin vat.

I published articles on my exhibition proposal and the stereolithographs that had been built in the Silicon Graphics "Iris Universe" user's magazine.  This got me invited to speak at Imagina '91 in Monte-Carlo.  The other Americans there with me were Michael Kass from Apple Research and Paul Haeberli from Silicon Graphics.  I sent a copy of my exhibition proposal to Stephen Wolfram, who invited me to Champaign, IL for several weeks to develop illustrations for the Second Edition of the Mathematica Book, which coincided with release 2.0 of the software.  Cliff Pickover invited me to speak at the IBM Thomas J. Watson Research Center in White Plains, NY.  He invited me to speak at the American Physical Society Physics Computing '91, where he introduced me to Ivars Peterson, Mathematics writer for Science News.  Ivars wrote an article on my work in the August 3, 1991 issue of Science News.

In 1992, Michael Scroggins, Instructor at the Cal Arts Film School, and I wrote a proposal which was accepted by the Art and Virtual Environments Programme at the Banff Centre for the Arts in Alberta, Canada.  Michael and I spent a total of about a month at the Banff Centre between 1992 and 1994 completing "The Topological Slide" -- an immersive experience of surfing on mathematical surfaces with a physical surfboard mechanical computer interface.  I was invited to speak and exhibit at the Third International Symposium on Electronic Art in Sydney, Australia.  I spent a total of over six weeks away from work in 1992, and people at The Post Group were getting right pissed-off at me.

In November of 1993, I started work for Paul Sidlo at ReZ.n8 Productions on the fifth floor of the historic El Capitan Theatre Building on Hollywood Boulevard at Highland.   Paul had been Creative Director at Cranston-Csuri Productions in Columbus, Ohio prior to the Computer Graphics Crash of 1987.  I was continuing to follow in the footsteps of Evan Ricks, who had been 3D graphics lead at The Post Group up to 1987, when he co-founded ReZn8 with Paul.  When I started at ReZn8, they were in full-bore preparation for the 1994 Winter Olympics at Lillehammer, Norway.

January 17th, 1994, 04:30 AM PST -- the Northridge Earthquake hit.   I lived in Thousand Oaks, a good 20 miles west of the epicenter.  We knew it was *The Big One*.   My father-in-law had helped us put an addition on our house some years earlier.   It involved an extension grafted onto the concrete slab, and a fairly complicated gabled joint in the roof.  It held together.   It was the rainy season.   The roof didn't even leak afterwards.  Hollywood was a different story.  Sean O'Gara, a programmer, was at work at ReZn8 when the earthquake hit.  He was not in his office -- he was in the hallway -- and therefore did not get hit by falling bookshelves.  One of the producers, Cathy, was an emotional basket case.  She lived in one of those long, two-story wood-frame and stucco apartment buildings in Studio City.   Unlike many of its kind, it hadn't fallen, but there were major cracks in all of the interior walls and she was pretty freaked-out.

The El Capitan Building -- steel-framed with brick veneer -- had banged together repeatedly with the building next to it, breaking up much of the outer walls between them.  The fire-sprinkler water tank on the roof broke loose and ruptured.  The roof itself was compromised to the extent that water was running down all the interior walls by the time I made it in at 9:00 AM.  All the SGI 4D-series computers had rolled off of their shelves in the computer room, but amazingly they were longer than the gap to the wall, so they were all still suspended off the floor and had not fallen.

The task of the day was to move all of the machinery for the facility to people's houses nearby, where small networks were constructed in order to continue production work.   I got a pager and used my Ford Explorer for the move.  We accomplished the task by the end of the day.  The El Capitan Building was red-tagged by the City the next day.   Disney bought it in a fire sale some years later and it was re-opened in time for the premiere of "Dinosaur" in 2000.  ReZn8 moved into the tenth floor of the CNN Building at Sunset and Cahuenga -- "safest building in Hollywood," according to Paul.   It was built Japanese-style on big rubber hockey-pucks.  We had magnitude 3-4.5 aftershocks for a good six weeks after the main quake.

During the Summer of 1993, I had been working at home -- Stephen Wolfram/Joe Grohens had sent me cast-off computers from the Wolfram Research Graphics/Publications Departments -- an Apple Mac-IIe and a Silicon Graphics Personal Iris 4D/20G.  I had an ISDN LAN-WAN gateway through GTE California to a local computer guy in Thousand Oaks.  Jill Smith from Perth, Western Australia e-mailed me -- "Have you seen this NCSA Mosaic thing?"   By the Summer of 1994, I had registered the Internet Domain Name "" (it was free of charge in those days) and had an Httpd v1.0 web server up and running on my Personal Iris from home.   Paul Sidlo got absolutely livid with me when I got "" into NCSA's "What's New on the World-Wide Web" a month before "" made it in.

That year, I also consulted as Digital Foundryman for Carl Cheng's Los Angeles Metro Rail Percent-for-Art Green-Line Commuter Station installation at Marine Avenue in El Segundo.  It was across the street from TRW Aerospace, and the theme was "Space Station".   I obtained CDs containing Planetary Topographic Data Records of Mars and Venus which had been collected by the Viking and Magellan spacecraft.  I ordered the CDs on-line.  Someone from NASA called me up to make sure that I had really meant to order them.  These were the early days of e-commerce.  I created eight-inch-square fragments of cratered planetary surface replicas, built in Stereolithography at Hughes Opto-Electronics and Data Systems Division in El Segundo, and Carl cast them into glazed concrete for insertion into the walls of the train station.

Paul Sidlo and I had written a NASA/Industry Technology Development grant based on ideas for serving LandSat Map Data and Digital Cinema data on-line.   Meanwhile, Carl Cheng and I had our own set of visits and tours at NASA/JPL/CalTech related to the LA Metro Rail project. 

In 1995, Internet Pronoia set in.  We had been attempting to sell our house in Thousand Oaks, CA on and off for about three years.  We sold it in the Spring of 1995, but had no plans for where to go next.  I had been developing interactive, client-server-based Internet applications for almost a year in my own, home-grown version of Dynamic HTML and NCSA Httpd version 1.0.  I had been seeding the search engines and watching my server's page-hit statistics.   My statistics were pretty impressive for ISDN.  I had joined an on-line group called "The Creators", led by one Oyra Matrix of the Frontier Range -- the foothills of the Rocky Mountains near Denver, Colorado.

We moved in with my Father-in-Law, in his three-bedroom townhouse in Wilmington, Delaware.  I set up my computers in his basement and dining room.  My wife started working for her brother in his gourmet submarine  sandwich shop.  I played stay-at-home Dad.   I incorporated MathArt.  I attended local meetings on Technology Development.   I visited the Dupont SOMOS rapid-prototyping group and the lab doing research in assistive technology for the disabled at the Alfred I. duPont Institute.   Except for these two groups, I received blank stares from most of the people I talked to in Delaware.

ISDN service from GTE, California was a flat monthly rate per bandwidth tier.  Bell Atlantic, Delaware charged by the minute for line usage.  I spoke at a Ralph Nader Consumer Technology Project Workshop on ISDN tariffs in Washington, D.C.  I got a picture and half a page of text on the front page of the New York Times Monday Business Section.   I sold an image for the cover of a math textbook by W. C. Brown, Publishers.

One day in January, 1996, I got a call from the Office Manager at ReZn8 Productions:   Marlon Brando had called the office looking for me.   He left a number.  When I returned the call, his assistant put me through.   Marlon had found me on-line, searching for makers of geodesic structures.  He wanted to build an underwater observatory and shelter from Category-IV hurricanes to be moored in the lagoon of Tetiaroa, his island in Tahiti.   I put in a call to Bruce Beasley, a pioneer in casting polyester bathyspheres and aquarium tank walls.  I had organized a panel session on digital sculpture at SIGGRAPH 1991, Las Vegas.  Bruce Beasley was a member of that panel.  What Marlon needed was not so much a designer of geodesic structures as an underwater structural engineer.

In the Spring of 1996, every major film studio in Hollywood decided that they needed to build digital production departments at the same time.   They hired up all available talent in the world.   I went to work at Walt Disney Feature Animation on the "Dinosaur" production with a $10,000 signing bonus plus moving expenses.  We bought a house in the Bird Streets -- the old neighborhood, which dates from the 1930's -- of Calabasas 91302 -- second most popular zip code after 90210.

Marlon Brando continued to call me up at work and bend my ear on the telephone for hours on end for the next two years.  I finally convinced him to have a face-to-face meeting.   It was one weekend at his house.  He warmed the swimming pool up to 90 degrees and it was the kids' pool party day at Marlon Brando's house while the adults sat indoors and talked.  He was really in no better financial shape than I was, owing to his many familial hangers-on.   He mainly wanted to trade his name for things like power generation and storage systems for his island.   I was in no position to work for free.

When I started working for Disney Feature Animation, the "Dinosaur" production was housed in an Imagineering building on Flower Street in Glendale.  Alan Kay (of Xerox PARC) and W. Daniel ("Danny") Hillis were Corporate Fellows -- Vice Presidents of Creative Technology at Walt Disney Imagineering.  Isaac Victor Kerlow, of the ACM SIGGRAPH organization, was a Digital Talent Scout at the Walt Disney Studio.  Because the Disney Fellows were charged with consulting for the Company on Creative Technology, I could arrange tours of the "Dinosaur" production for them, where I could also meet face-to-face and show them my digital/mathematical sculpture.

Shortly after Danny's tour, he called me up saying that I was the only person he could think of who could make a mechanical part from a mathematical equation.  The result was the Equation of Time Cam for the Clock of the Long Now.   We had a meeting over in his office at Imagineering R&D where we exchanged book signings:  I signed his copy of Stephen Wolfram's Mathematica, Second Edition, and he signed my copy of The Clock of the Long Now: Time and Responsibility by Stewart Brand.

Everyone on the "Dinosaur" production tacitly knew that there would not be enough work to support all of us after the show wrapped.   The layoffs began as "Tarzan" wrapped.  I was able to stay long enough to work on "Magic Lamp" for Tokyo DisneySea, "Reign of Fire" for Touchstone Pictures and the ill-fated "Wild Life", and I designed, implemented and supported the four-camera, stereo + 2X peripheral virtual camera rig in Maya for "Mickey's PhilharMagic" at the Walt Disney World Magic Kingdom.  There is nothing quite like the thrill of seeing four 70-mm film projectors edge-match to within 1/2 pixel on a 140-foot-wide panoramic screen.

In the Spring of 2002, a head-hunter from Oak Ridge, Tennessee found me.

