At this point I have an audio track (usually still without music) and all of my scenes and photography work rendered and ready to be compiled in After Effects. It’s within AE that I can crossfade scenes, add any additional video or image elements, warp text or images, create complementary transitions between scenes or pictures, build a dome mask, put together title sequences, and so on.
Before I import any sequences or do anything within After Effects, I make sure I set up and check all of my preferences for this particular project. It’s going to be a 16-bit project with audio set to 24-bit, 44.1 kHz; the frame rate is going to be 30 fps; and the pixel dimensions are 4K for the dome, which is 4096×4096. This will be the master composition where everything gets joined together. With all that set up, I can begin importing the scene sequences.
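Just to illustrate those numbers, here’s a small Python sketch that sanity-checks a folder of rendered frames against the dome spec. This isn’t part of the actual AE setup, and the folder name is made up; it assumes the Pillow library is installed:

```python
from pathlib import Path
from PIL import Image  # Pillow; assumed to be installed

# Master composition spec for the dome (from the project settings above).
DOME_SIZE = (4096, 4096)   # 4K fulldome frame, square
FRAME_RATE = 30            # fps

def check_sequence(folder):
    """Warn about any rendered frame that doesn't match the dome spec."""
    frames = sorted(Path(folder).glob("*.png"))
    for frame in frames:
        with Image.open(frame) as img:
            if img.size != DOME_SIZE:
                print(f"{frame.name}: {img.size} (expected {DOME_SIZE})")
    seconds = len(frames) / FRAME_RATE
    print(f"{len(frames)} frames ~ {seconds:.1f} s at {FRAME_RATE} fps")

check_sequence("renders/scene_01")  # hypothetical folder name
```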
I don’t make mp4s or video files for each scene. I render each scene into a frame sequence. Sometimes I’ll make a video within a scene and I might make that into something like an .mp4, but for the most part I’m strictly dealing with frame sequences.
It’s at this point where I like to add fly-out images to the show.
For example, if I mention that the Helix Nebula can be found in your night sky, I’ll bring up Aquarius, the constellation it sits in, zoom an image of the Helix Nebula up from its actual location on the dome, then zoom it back down into that same spot.
This can be done within Digital Sky, but it’s far easier to do in After Effects. I can easily add a spin to the image, materialize it out of dust, or even blow it up if I want to. Adding text, images, and video in After Effects is far easier than dealing with it in Digital Sky.
That being said, I still have to add those images I’m using in the automated show to the Live buttons for our live shows. So, either way I look at it, I’m going to be adding the images to Digital Sky to be viewed later. It’ll just look slightly different.
Once I have the scenes looking the way I want, I’ll render myself a small video before I leave work for the day. That way, when I get back in the next day, I’ll have something to look at and judge what needs to be adjusted, added, or removed.
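After Effects handles that overnight preview render for me, but just to show what the step amounts to, here’s a rough sketch of turning a frame sequence into a small preview movie with ffmpeg called from Python. The paths and the preview size are placeholders, and it assumes ffmpeg is installed on the machine:

```python
import subprocess

# Hypothetical paths; the real sequence comes straight out of After Effects.
FRAMES = "renders/show_master/frame_%05d.png"   # numbered frame sequence
PREVIEW = "preview/show_preview.mp4"

# Downscale the 4096x4096 frames to something small enough to scrub through quickly.
subprocess.run([
    "ffmpeg", "-y",
    "-framerate", "30",          # match the project's 30 fps
    "-i", FRAMES,
    "-vf", "scale=1024:1024",    # small preview size
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",       # broad player compatibility
    PREVIEW,
], check=True)
```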
Once everything is compiled, crossfaded, added, adjusted, masked, and set in stone, it’s time for the final render.
I finally export the master frame sequence out of After Effects and onto our NAS server, where it will wait to be sliced. After this is done, I make a small mp4 of the show for myself so I can take it into my audio workstation to work on surround sound effects.
Every movie I put on the dome has to be sliced. We have multiple projectors that work together to display the image you see on the dome, and each projector gets just one part of the final image. The piece each projector gets is called a slice.
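To make the idea concrete before I get into the real process, here’s a purely illustrative Python sketch that carves one master frame into rectangular pieces, one per projector. The two-by-two layout, the paths, and the lack of any overlap are all made up; actual dome slicing also deals with warping and edge blending, which dedicated software handles:

```python
from pathlib import Path
from PIL import Image

# Purely illustrative: a made-up 2x2 layout with no overlap or warping.
GRID = 2  # 2x2 = 4 hypothetical projectors

def slice_frame(frame_path, out_dir):
    """Cut one master frame into equal rectangular slices, one per projector."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with Image.open(frame_path) as img:
        w, h = img.size                   # 4096 x 4096 master frame
        sw, sh = w // GRID, h // GRID     # slice dimensions
        for row in range(GRID):
            for col in range(GRID):
                box = (col * sw, row * sh, (col + 1) * sw, (row + 1) * sh)
                img.crop(box).save(out / f"{frame_path.stem}_slice{row}{col}.png")

slice_frame(Path("renders/show_master/frame_00001.png"), "slices")  # hypothetical paths
```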
Next up, I’ll tell you how I slice a final frame sequence to be played on multiple projectors.