
SMPTE Technical Conference and Exhibition

Development of a Compact Motion Control Camera System for HD Digital Broadcasting

Hiroyasu Masuda [1], Akira Akahoshi [1], Tetsuaki Nakazawa [1], Keita Kataoka [1], Daiichiro Kato [1], Tsuyoshi Ueyama [2], Yoichirou Ito [3]

[1] NHK (Japan Broadcasting Corporation), 2-2-1 Jinnan, Shibuya-ku, Tokyo 150-8001, Japan
[2] DENSO WAVE Inc., 1-1 Showa-cho, Kariya City, Aichi 448-8661, Japan
[3] Nihon Fukushi Univ., 26-2 Higashihasemi-cho, Handa City, Aichi 475-0012, Japan

Abstract

A motion control camera is one that repeats camera operations identically under computer control. It is often used in cinema and other productions that require complex multiple compositions. NHK has applied state-of-the-art Japanese industrial robotic technologies to develop a motion control camera based on a new concept for use in HDTV production. It is compact thanks to the use of a small HD camera, and it offers a convenient user interface. Compatibility of the camera data with the popular MAYA CG software makes shooting more efficient by enabling advance simulation of the robotic motion. The ease of reproducing identical camera operations reduces the time required for the multiple and CG compositions of VFX productions.

1. Introduction

The motion control camera is a camera system that repeats camerawork identically under computer control. It can, for example, be used to convert a few extras into a crowd of thousands by multiplying images, or to prepare a CG background for a real scene using the sequential camera data of the original shooting. Special operations of this kind require a very precise motion control camera with high repeatability and cannot be performed by manual camera work. The motion control camera has therefore come to be regarded as an essential system for VFX productions.

NHK already uses existing motion control cameras in drama and other program productions, renting a large foreign-made system because there is no Japanese equivalent. This has the disadvantage, however, of being inefficient for TV production, owing to the complexity of installation and the need for specialist technicians to set the camera operations. We have now developed a motion control camera system that is easy to operate and compact enough for use in a confined TV studio. The design targets included:

- A compact system capable of both 3m-high bird's-eye and high-speed mobile shooting
- An easy man-machine interface
- Input/output data compatibility with MAYA CG software for simulation on a PC

Japan's world-class robotic technologies were employed to achieve high precision and convenient operation and control.

SMPTE Technical Conference and Exhibition November 9-12, 2005 New York City


2. System Overview

2-1. Outline

Figure 1 is an external view of our motion control camera system. The crane arm and camera mount are compact enough for the camera head to be operated on a confined studio set.

Figure 1: External View of the System

Table 1 lists the specifications of the system, and Figure 2 shows the conceptual diagram. The hardware consists of three major components: an interface PC for robotic control, positional instructions and run commands; a control box, which houses the servomotor drivers and a power supply unit; and a motion control crane. The crane has six camera motion axes (rotate at the bottom and top of the arm, lift, pan, tilt and roll) plus the backward-and-forward axis of the crane along the track. A compact HD camera with a 21x HD zoom lens was employed in view of use on confined studio sets. Zoom and focus are also controlled via the interface PC.

The software consists of a man-machine interface (MMI) and a robot controller that generates run commands for the crane. The crane position for each axis is selected using joysticks, and the positional control data are input to the MMI via analog I/O. Once all positional data have been input to the MMI for each given key frame, the motion trajectory (a spline curve [1]) is generated from the starting point to the finish of the robotic motion. When a run command for motion control is sent from the MMI, the data pass to the robot controller for conversion into control data for each frame. The motion controller inside the robot controller sends the command signal to each motor driver in the control box, and each servomotor of the crane moves accordingly to achieve the crane operation. Positional camera data are output in the MAYA 3D-CG software format for each frame. The system can, therefore, simulate the robotic motion in 3D on the interface PC before the shooting takes place.
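The trajectory generation described above fits a smooth curve through the key-frame positions and samples it once per frame. As an illustrative sketch only (the paper cites Numerical Recipes for its spline method, so the exact formulation may differ; the Catmull-Rom form and function names below are assumptions), per-axis interpolation might look like:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom spline segment at parameter t in [0, 1]."""
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def sample_axis(key_positions, frames_per_segment):
    """Sample a smooth per-frame position list through the key-frame positions."""
    # Duplicate the end points so the curve passes through the first and last keys.
    keys = [key_positions[0]] + list(key_positions) + [key_positions[-1]]
    samples = []
    for i in range(len(key_positions) - 1):
        p0, p1, p2, p3 = keys[i], keys[i + 1], keys[i + 2], keys[i + 3]
        for f in range(frames_per_segment):
            samples.append(catmull_rom(p0, p1, p2, p3, f / frames_per_segment))
    samples.append(float(key_positions[-1]))
    return samples
```

Running such a routine once per crane axis yields the per-frame control data that the robot controller converts into motor commands.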



The system is synchronized by a black & burst signal (bi-level sync) output from a sync generator, while the HD camera and clapper are synchronized to HD Sync (tri-level sync) output by an HD sync generator. The host Time Code Generator (TCG) is locked to the black & burst signal. The clapper and the motion control system operate on the basis of that timeline.

Base Size              : 1050mm(L) x 1050mm(W) x 1130mm(H) *1)
Arm Length             : 2300mm
Total Weight           : 500kg
Max. Lens Height       : 3000mm
Min. Lens Height       : -400mm *2)
Range of travel
  Track                : 10000mm (Max)
  Rotate               : +192deg / -132deg
  Lift                 : +110deg / -10deg
  End of Arm           : +149deg / -27deg
  Pan                  : +166deg / -166deg
  Tilt                 : +125deg / -125deg
  Roll                 : +164deg / -164deg
Max Tracking Speed     : 1500mm/sec
Max Arm Speed          : 2000mm/sec
Weight capacity of arm : 10kg

*1) From Ground to 2nd Joint Axis
*2) Low Angle Setting Condition

Table 1: System Specifications



[Block diagram: the motion control crane (axes J1-J6 plus the track moving axis) is driven by servomotor drivers in the control box under a robot controller containing the motion controller, a PLC (sequencer) and input/output. A sync generator (black & burst) and an HD sync generator time the system, clapper board and HD camera, with a timecode generator locked to the sync. The interface PC (man-machine interface, robot control interface, analog/digital converter, keyboard & LCD, joystick controller) connects to the robot controller over 100Base-T Ethernet, remote-controls the HD camera, outputs camera control data for "MAYA", and feeds an HD video recorder and HD monitor.]

Figure 2: Conceptual Diagram for the System



Figure 3 is a flowchart of the software, and Figure 4 shows an MMI display screen during data input. Coordinates (x, y, z) and angle data for each key frame provide the positional instructions. The optimal motion trajectory is calculated automatically by setting the length of the transition between key frames. The coordinates and angle data can be edited as desired.
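The key-frame angle data and the object distance together determine where the camera is aimed. A minimal, hypothetical sketch of that relationship (the axis conventions and function names here are our assumptions, not taken from the paper):

```python
import math

def look_direction(pan_deg, tilt_deg):
    """Unit view vector for given pan/tilt angles.

    Convention (an assumption, not the paper's): pan rotates about the
    vertical z-axis, tilt raises the view out of the horizontal plane,
    and pan = tilt = 0 looks along +x.
    """
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    return (math.cos(tilt) * math.cos(pan),
            math.cos(tilt) * math.sin(pan),
            math.sin(tilt))

def aim_point(position, distance_mm, pan_deg, tilt_deg):
    """Object position implied by a key frame's coordinates, angles and distance."""
    direction = look_direction(pan_deg, tilt_deg)
    return tuple(p + distance_mm * d for p, d in zip(position, direction))
```

Editing a key frame's coordinates or angles in the MMI then shifts the implied aim point accordingly, which is what the trajectory editor lets the operator check.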

[Flowchart: joystick operating angles are captured and converted to speed command values on the PC; key-frame data are captured on a trigger and a trajectory is calculated; the data pass over Ethernet to the robot controller, which calculates position/speed command values for each axis in camera coordinates and drives each motor via its motor driver and encoder. Position data captured back from each axis are converted to camera coordinates and logged as operation result data, converted to/from MAYA data, or replayed as motion data.]

Figure 3: Flowchart of Software

Figure 4: MMI Display Screen


2-2. Downsizing the System

The crane body is made compact by housing the power supply unit and the servomotor drivers that control the crane's motion axes in a separate case. The crane arm was also shortened by employing a rotational-drive system rather than a directly operated arm, which requires a counterweight at the rear of the crane and a longer arm. As a result, the camera system is only half the size of the motion control cameras we have been using so far. As mentioned above, a compact HD camera was mounted at the top of the crane to permit shooting on a confined studio set.

[Top and side views comparing the footprints of the existing system and ours; marked dimensions include 1300mm and 1050mm for one base and 1500mm with an 800mm rail for the other.]

Figure 5: Size Comparison with Existing Motion Control Cameras

2-3. Operational Interface

Finding the Distance to the Object from Zoom & Focus Data

In conventional motion control shooting, it is necessary to input the distance (L) from the camera to the object as well as the camera position coordinates (x, y, z) to frame the object exactly in the desired position while the system is operating [2]. The precision of range finding has a direct impact on shooting accuracy. We have established a system that improves operating efficiency by calculating the distance from the zoom and focus data. Once a motion trajectory is generated, the zoom and focus data are stored together with the camera position coordinates for each given key frame. The distance from the object to the camera is then calculated automatically from the stored data. To calculate this distance, the system aligns the optical axes of the camera and lens and finds the point at which the angles encoded in the zoom and focus servo-controller correspond to the actual zoom (angle of view) and focus (distance).
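Deriving the object distance from the focus servo data amounts to inverting the lens's focus calibration. A hedged sketch, assuming a measured lookup table mapping focus encoder counts to focused distance (the table values below are invented for illustration; a real system would calibrate them for the actual lens):

```python
import bisect

# Hypothetical calibration pairs for the focus servo:
# (encoder count, focused distance in mm).
FOCUS_CAL = [(0, 800.0), (1000, 1500.0), (2000, 3000.0),
             (3000, 6000.0), (4000, 20000.0)]

def distance_from_focus(encoder_count):
    """Linearly interpolate camera-to-object distance from the focus encoder."""
    counts = [c for c, _ in FOCUS_CAL]
    if encoder_count <= counts[0]:
        return FOCUS_CAL[0][1]
    if encoder_count >= counts[-1]:
        return FOCUS_CAL[-1][1]
    i = bisect.bisect_right(counts, encoder_count)
    (c0, d0), (c1, d1) = FOCUS_CAL[i - 1], FOCUS_CAL[i]
    return d0 + (d1 - d0) * (encoder_count - c0) / (c1 - c0)
```

In practice the focus-to-distance curve is highly nonlinear near the far end, so a denser table (or a fitted lens model) would be used between calibration points.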




[Diagram: in the existing system, the distance to the object must be input separately for each camera position (x, y, z), (x', y', z'); in our system, the distance at each position is derived from the stored (Zoom, Focus), (Zoom', Focus') data.]

Figure 6: Range Finding between Camera Head and Object


Joysticks

Generally speaking, the camera's positional instructions for motion control systems are sent for each axis separately. In this new system, the camera position is controlled intuitively by means of joysticks. A gun-grip-like joystick controls the robotic arm, while the controls for pan, tilt and roll of the camera mount are integrated in a ball-type controller. A sliding controller is used for the track axis. To prevent operational error, the joysticks only function while a foot pedal is pressed.

Figure 7: Joystick Controllers

2-4. Outputting MAYA Data

Figure 8 shows a simulation on the screen of the interface PC. The output CGI data are in the format of MAYA, the major 3D-CG software. The system is also capable of importing CGI data from MAYA. Camera coordinate and angle data can be logged by time code during operation, in addition to the current angle data for each axis. In simulation mode, the robot's motion can be simulated in 3D on the PC without actually moving the robot. This enables us to confirm the correctness of the robotic motion prior to shooting, and the control data can be edited if necessary.
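The paper does not specify the exact MAYA file format used for the camera data. One plausible sketch, assuming the per-frame log is written out as MEL setKeyframe commands that Maya can source onto a camera node (the attribute mapping between crane angles and Maya rotation channels is our assumption):

```python
def maya_keyframe_script(camera_name, frames):
    """Emit MEL setKeyframe commands, one per frame of logged camera data.

    `frames` is a list of dicts with x/y/z translation (mm) and pan/tilt/roll
    rotation (deg); the attribute mapping here is a hypothetical convention.
    """
    attrs = [("translateX", "x"), ("translateY", "y"), ("translateZ", "z"),
             ("rotateX", "tilt"), ("rotateY", "pan"), ("rotateZ", "roll")]
    lines = []
    for frame_no, data in enumerate(frames, start=1):
        for attr, key in attrs:
            lines.append('setKeyframe -time %d -attribute "%s" -value %.3f %s;'
                         % (frame_no, attr, data[key], camera_name))
    return "\n".join(lines)
```

Because the same per-frame keys can be generated in the simulator before shooting, the CG camera and the physical camera follow identical motion, which is what makes the advance 3D simulation trustworthy.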




Figure 8: Simulation on the PC Screen

3. Motion Evaluation Experiment

We generated the motion trajectory illustrated below to evaluate the basic performance of our motion control camera system.

[Top view (X-Y plane) and back view (Z-Y plane) of the camera's path around the target.]

Figure 9: A Motion Trajectory for the Camera



Figure 10: Command Data and Results

Figure 11: Error between Command Data and Results

The motion control camera was commanded to pan and lift for about 70 seconds. Zoom and focus were held constant, and the camera had to retain its focus on the target. The experiment was conducted with positional instructions provided at three points, namely the start, middle and finish, as illustrated in Figure 9, to generate control data for three key frames. Figure 10 compares the command data (x, y, z) and the operational results (x', y', z') with the servomotors operating. All three axes moved accurately as commanded. Figure 11 illustrates the combined error for each axis shown in Figure 10. The maximum error is less than 4.5 mm, meaning that each servomotor functioned extremely well. We can say that our motion control camera operates as intended by the operator.

Repeatability, the most important practical performance item for a motion control camera, was also measured. The system was ordered to repeat a motion identically three times, and the resulting error in the spatial coordinates was less than 0.03 mm, confirming that the system has sufficient repeatability.

Figure 12 shows the change of commanded speed during a motion control operation, and Figure 13 the commanded acceleration.
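A repeatability figure of this kind can be computed by comparing repeated runs frame by frame against their mean path. A simple sketch of such a measurement (our own formulation, not necessarily the authors' exact procedure):

```python
import math

def repeatability_error(runs):
    """Maximum spatial deviation of repeated runs from the per-frame mean path.

    `runs` is a list of trajectories; each trajectory is a list of (x, y, z)
    camera positions, one per frame, with all runs the same length.
    """
    worst = 0.0
    for frame_points in zip(*runs):
        n = len(frame_points)
        mx = sum(p[0] for p in frame_points) / n
        my = sum(p[1] for p in frame_points) / n
        mz = sum(p[2] for p in frame_points) / n
        for x, y, z in frame_points:
            dev = math.sqrt((x - mx) ** 2 + (y - my) ** 2 + (z - mz) ** 2)
            worst = max(worst, dev)
    return worst
```

Applied to three repeated runs of the experiment's trajectory, this returns the worst-case spread, which for our system stayed below 0.03 mm.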

Figure 12: Commanded Speed

Figure 13: Commanded Acceleration
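A jerk-limited command profile of the kind plotted in Figures 12 and 13 can be generated by ramping the commanded acceleration rather than stepping it, so the speed curve has no corners. The following is an illustrative sketch with invented limit values, not the authors' controller:

```python
def jerk_limited_speed(v_target, j_max, a_max, dt, steps):
    """Velocity samples of a jerk-limited (S-curve) ramp from rest to v_target.

    Acceleration builds at rate j_max, holds at a_max, then ramps back down
    so the speed approaches v_target smoothly. Units are arbitrary; the
    limits passed in here are illustrative assumptions.
    """
    v, a = 0.0, 0.0
    samples = []
    for _ in range(steps):
        # If the remaining speed gap can be closed while ramping the
        # acceleration back to zero, start reducing acceleration now.
        if v_target - v <= a * a / (2 * j_max):
            a = max(0.0, a - j_max * dt)
        else:
            a = min(a_max, a + j_max * dt)
        v = min(v_target, v + a * dt)
        samples.append(v)
    return samples
```

Bounding the jerk in this way is the same idea the text attributes to elevator design: acceleration changes gradually, so neither the servomotors nor the image experience an abrupt transient.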



During this experiment, the system was commanded to bring its movement to a standstill once in the middle, so the operation was performed in two sections. Figure 13 shows that the system operated smoothly even at the transition point. Allowance was made for the jerk of the servomotor when generating the command signal; jerk is the rate of change of acceleration. Such an allowance is also made, for example, in elevator design to prevent discomfort for passengers. This method obtains smooth control of every system motion.

Figure 14 shows a CG composition with images of an object shot repeatedly using this system. Still image (a) is captured from the first shot of the object, (b) from the second shot of the same object, and (c) is an image composed from both stills and CG. We confirmed that the images could be overlaid completely without error.


Figure 14: Shot Images and Compositional Result


4. Conclusions

The motion control camera has been regarded in Japan as a tool for large-scale cinematic shooting rather than for TV production. As a TV broadcaster, however, we have found it useful to possess a motion control camera for TV work, and we have now developed a system that is compact and easy to use. It achieves very high accuracy through the application of world-class Japanese industrial robotic technology. We are now working on further system simplification, miniaturization of the control box, and the use of fiber optic connections between the robot and the controller. The motion control camera introduced in this paper should provide a basis for future systems. We wish to move forward now with the establishment of entirely new methods of robotic control and interfacing for much easier operation. We believe that the system will be utilized in various kinds of program production.

We would like to thank Ikegami Tsushinki Co., Ltd. and Fujinon Inc. for their cooperation with our development project.

References

[1] W. H. Press et al., Numerical Recipes in C, Cambridge University Press, 1992.
[2] E. B. Dam et al., "Quaternions, Interpolation and Animation," 1998.




