In visual effects, match moving is a technique that allows the insertion of 2D elements, other live-action elements, or computer graphics (CG) into live-action footage with correct position, scale, orientation, and motion relative to the photographed objects in the shot.
Also referred to as motion tracking or camera solving, match moving is related to rotoscoping and photogrammetry.
In contrast to motion capture, which requires special cameras and sensors in a controlled environment, match moving is typically a software-based technique, applied after the fact to normal footage recorded in uncontrolled environments with an ordinary camera.
As it is mostly software-based, match moving has become increasingly affordable as the cost of computer power has declined; it is now an established visual-effects tool and is even used in live television broadcasts as part of providing effects such as the yellow virtual down-line in American football.
A feature is a specific point in the image that a tracking algorithm can lock onto and follow through multiple frames (SynthEyes calls them blips).
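Features of this kind can often be detected automatically. As a rough illustration, the sketch below uses OpenCV's Shi-Tomasi corner detector; the file name and parameter values are illustrative, not taken from any particular matchmoving package:

```python
import cv2

# Load one frame of the plate as grayscale; "plate_0001.png" is a
# hypothetical file name standing in for real footage.
frame = cv2.imread("plate_0001.png", cv2.IMREAD_GRAYSCALE)

# Shi-Tomasi corner detection: each returned corner is a candidate
# feature that a tracker could lock onto and follow across frames.
features = cv2.goodFeaturesToTrack(frame,
                                   maxCorners=300,    # cap on feature count
                                   qualityLevel=0.01, # relative corner strength
                                   minDistance=10)    # min pixel spacing
print(f"{len(features)} candidate features found")
```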
In reality, errors introduced during the tracking process require a more statistical approach to determining a good camera vector for each frame; optimization algorithms and bundle block adjustment are often utilized.
Unfortunately, there are so many components to a camera vector that when every parameter is free we still might not be able to narrow F down to a single possibility, no matter how many features we track.
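To make the statistical approach concrete, the sketch below refines one frame's camera vector by minimizing reprojection error with a least-squares optimizer; a full bundle adjustment would optimize every frame's camera and the 3-D feature positions jointly. The pinhole model, the seven-parameter camera vector, and the synthetic data are all assumptions for illustration:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

rng = np.random.default_rng(0)
pts3d = rng.uniform(-1, 1, (20, 3))   # hypothetical solved scene points

def project(params, pts3d):
    # Camera vector: 3 rotation (rotation-vector), 3 translation, 1 focal length.
    rvec, tvec, f = params[:3], params[3:6], params[6]
    cam = Rotation.from_rotvec(rvec).apply(pts3d) + tvec
    return f * cam[:, :2] / cam[:, 2:3]   # pinhole projection

# Synthesize noisy 2D tracks from a "true" camera, standing in for real data.
true = np.array([0.1, -0.05, 0.02, 0.2, -0.1, 5.0, 1200.0])
pts2d = project(true, pts3d) + rng.normal(0, 0.5, (20, 2))

def residuals(params):
    # Reprojection error: predicted feature positions minus tracked positions.
    return (project(params, pts3d) - pts2d).ravel()

x0 = np.array([0, 0, 0, 0, 0, 5.0, 1000.0])   # rough initial guess
fit = least_squares(residuals, x0)
print(fit.x)   # refined camera vector for this frame
```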
Two-dimensional match moving only tracks features in two-dimensional space, without regard to camera movement or distortion.
This technique is sufficient to create realistic effects when the original footage does not include major changes in camera perspective.
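A minimal sketch of such a two-dimensional track follows, using OpenCV's pyramidal Lucas-Kanade optical flow to follow features from one frame to the next and derive the average 2-D offset an inserted element would need; file names and parameters are illustrative:

```python
import cv2
import numpy as np

# Two consecutive frames of hypothetical footage.
prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=100,
                                   qualityLevel=0.01, minDistance=10)

# Pyramidal Lucas-Kanade optical flow follows each feature into the next frame.
pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)

# Average displacement of successfully tracked features: the 2-D offset by
# which an inserted element would be moved to stay locked to the plate.
good = status.ravel() == 1
offset = (pts_curr[good] - pts_prev[good]).reshape(-1, 2).mean(axis=0)
print("per-frame offset (dx, dy):", offset)
```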
Three-dimensional match moving tools allow users to derive camera movement and other relative motion from arbitrary footage.
The tracking information can be transferred to computer graphics software and used to animate virtual cameras and simulated objects.
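How the solved data reaches the graphics package varies; one common pattern is exporting a per-frame camera transform that the 3-D application keys onto its virtual camera. The 4x4 matrix convention and JSON container below are illustrative assumptions, not a standard interchange format:

```python
import json
import numpy as np
from scipy.spatial.transform import Rotation

def camera_to_world(rvec, tvec):
    """Assemble a 4x4 camera-to-world transform from one solved pose."""
    m = np.eye(4)
    m[:3, :3] = Rotation.from_rotvec(rvec).as_matrix()
    m[:3, 3] = tvec
    return m

# Hypothetical solved poses: a slow pan while dollying right over 48 frames.
frames = [camera_to_world([0.0, 0.01 * i, 0.0], [0.05 * i, 0.0, 5.0])
          for i in range(48)]

with open("camera_solve.json", "w") as fh:
    json.dump([m.tolist() for m in frames], fh)
```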
Programs capable of 3-D match moving include SynthEyes, among others. There are two methods by which motion information can be extracted from an image: interactive tracking, in which the user manually follows features through the scene, and automatic tracking, in which algorithms identify and follow features through the shot.
Automatic tracking also suffers when a shot contains a large amount of motion blur, which makes the small details it relies on harder to distinguish.
The disadvantage of interactive tracking is that the user will inevitably introduce small errors as they follow objects through the scene, which can lead to what is called "drift".
Tracking mattes are also employed to mask out areas of the shot which contain moving elements, such as an actor or a spinning ceiling fan, so that the solver ignores motion that does not belong to the camera.
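In feature-based trackers, a matte of this kind can be passed directly to the detector so that the moving region is ignored; the rectangle and file name below are illustrative:

```python
import cv2
import numpy as np

frame = cv2.imread("plate_0001.png", cv2.IMREAD_GRAYSCALE)  # hypothetical plate

# A tracking matte: white where features may be detected, black over a
# (hypothetical) region occupied by a moving actor.
matte = np.full(frame.shape, 255, dtype=np.uint8)
matte[100:400, 250:500] = 0

# The mask argument tells the detector to skip features inside the matte.
features = cv2.goodFeaturesToTrack(frame, maxCorners=300,
                                   qualityLevel=0.01, minDistance=10,
                                   mask=matte)
```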
Real-time camera tracking has the benefit of helping the director and actors improve performances by actually seeing set extensions or CGI characters whilst (or shortly after) they do a take.
Eye-line references, actor positioning, and CGI interaction can now be done live on-set, giving everyone confidence that the shot is correct and will work in the final composite.
Software collects the camera's six degrees of freedom of movement, as well as metadata such as zoom, focus, iris, and shutter settings, from many different types of hardware devices: motion capture systems, such as the active LED marker-based system from PhaseSpace or passive systems such as those from Motion Analysis or Vicon; rotary encoders fitted to camera cranes and dollies, such as Technocranes and Fisher Dollies; and inertial and gyroscopic sensors mounted directly to the camera.
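There is no single schema for this data; a hypothetical per-frame record might bundle the six degrees of freedom with the encoded lens values, as sketched below (all field names are assumptions, not any vendor's format):

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    """One frame of on-set camera data; every field name is illustrative."""
    frame: int
    position: tuple      # (x, y, z) translation, e.g. from crane encoders
    rotation: tuple      # (pan, tilt, roll) in degrees
    zoom_mm: float       # focal length reported by the lens encoder
    focus_m: float       # focus distance
    iris_fstop: float    # aperture
    shutter_deg: float   # shutter angle
```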
Encoders on the crane can also be used in real time on-set, reversing this process to generate live 3D cameras.
The data can be sent to any number of different 3D applications, allowing 3D artists to modify their CGI elements live on set as well.
Real-time motion capture systems can also be mixed into the camera data stream, allowing virtual characters to be inserted into live shots on-set.
This dramatically improves the interaction between real actors and MoCap-driven CG characters, as both the plate and CGI performances can be choreographed together.