Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System
To address the complexity and repeatability limitations of existing broadcast filming systems, a new broadcast filming system was developed. In particular, for Korean music broadcasts, the shooting sequence comprises stage and lighting installation, rehearsal, lighting effect production, and main shooting...
Main Authors: | Wonjun Lee, Hyung-Jun Lim, Mun Sang Kim |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2022-01-01 |
Series: | International Journal of Digital Multimedia Broadcasting |
Online Access: | http://dx.doi.org/10.1155/2022/2724804 |
_version_ | 1832551167495766016 |
---|---|
author | Wonjun Lee, Hyung-Jun Lim, Mun Sang Kim |
author_facet | Wonjun Lee, Hyung-Jun Lim, Mun Sang Kim |
author_sort | Wonjun Lee |
collection | DOAJ |
description | To address the complexity and repeatability limitations of existing broadcast filming systems, a new broadcast filming system was developed. In particular, for Korean music broadcasts, the shooting sequence comprises stage and lighting installation, rehearsal, lighting effect production, and main shooting; however, this sequence is complex and involves multiple people. As the untact (contact-free) era emerged because of COVID-19, we developed an automatic shooting system that can produce the same effect as this sequence with a minimum number of people. The developed system is built around a simulator. After a stage is constructed in the simulator, the dancers’ movements are acquired during rehearsal using UWB and two-dimensional (2D) LiDAR sensors. By inserting the acquired movement data into the developed stage, camera effects are produced using virtual cameras installed in the simulator. The camera effects comprise pan, tilt, and zoom, and a camera director creates these effects while evaluating the movements of the virtual dancers on the virtual stage. In this study, four cameras were used: three were used for pan, tilt, and zoom control, and the fourth was used as a fixed camera for full shots. Video shooting is performed according to the pan, tilt, and zoom values of the three cameras and the switcher data. To assess lighting effects, the video of the dancers recorded during rehearsal and the lighting video produced by the lighting director via the existing broadcast filming process are overlapped in the developed simulator. The lighting director assesses the overlapped video and then corrects the parts that require correction or emphasis. This method produced lighting effects better optimized for the music and choreography than existing lighting effect production methods.
Finally, the performance and lighting effects of the developed simulator and system were confirmed by shooting a K-pop performance using the selected cameras’ pan, tilt, and zoom control plan, switcher sequence, and lighting effects. |
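The pan/tilt control described in the abstract, steering a camera toward a dancer position tracked by the UWB/LiDAR sensors, can be sketched with simple stage geometry. This is a minimal illustration only; the function name, coordinate conventions, and heights below are assumptions, not details taken from the article:

```python
import math

def pan_tilt_to_target(cam_pos, cam_height, target_pos, target_height=1.5):
    """Compute pan and tilt angles (degrees) for a camera at floor
    position cam_pos = (x, y), mounted at cam_height metres, so that it
    centers a dancer tracked at target_pos = (x, y).

    Illustrative sketch: assumes a flat stage, pan measured in the floor
    plane from the +x axis, positive tilt pointing upward.
    """
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    # Pan: rotation in the floor plane toward the dancer.
    pan = math.degrees(math.atan2(dy, dx))
    # Tilt: elevation angle from the camera to the dancer's torso height.
    dist = math.hypot(dx, dy)
    tilt = math.degrees(math.atan2(target_height - cam_height, dist))
    return pan, tilt

# Example: camera at the origin, 2 m high; dancer 3 m away along +x.
pan, tilt = pan_tilt_to_target((0.0, 0.0), 2.0, (3.0, 0.0))
```

In a system like the one described, angles of this kind would be computed per frame from the rehearsal movement data and replayed as the pan/tilt/zoom plan during the main shoot.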
format | Article |
id | doaj-art-1b0ff1b6e1e542549ce38ee9defba01a |
institution | Kabale University |
issn | 1687-7586 |
language | English |
publishDate | 2022-01-01 |
publisher | Wiley |
record_format | Article |
series | International Journal of Digital Multimedia Broadcasting |
spelling | doaj-art-1b0ff1b6e1e542549ce38ee9defba01a 2025-02-03T06:04:43Z eng Wiley International Journal of Digital Multimedia Broadcasting 1687-7586 2022-01-01 2022 10.1155/2022/2724804 Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System Wonjun Lee (School of Integrated Technology), Hyung-Jun Lim (School of Integrated Technology), Mun Sang Kim (School of Integrated Technology) http://dx.doi.org/10.1155/2022/2724804 |
spellingShingle | Wonjun Lee, Hyung-Jun Lim, Mun Sang Kim Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System International Journal of Digital Multimedia Broadcasting |
title | Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System |
title_full | Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System |
title_fullStr | Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System |
title_full_unstemmed | Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System |
title_short | Development for Multisensor and Virtual Simulator–Based Automatic Broadcast Shooting System |
title_sort | development for multisensor and virtual simulator based automatic broadcast shooting system |
url | http://dx.doi.org/10.1155/2022/2724804 |
work_keys_str_mv | AT wonjunlee developmentformultisensorandvirtualsimulatorbasedautomaticbroadcastshootingsystem AT hyungjunlim developmentformultisensorandvirtualsimulatorbasedautomaticbroadcastshootingsystem AT munsangkim developmentformultisensorandvirtualsimulatorbasedautomaticbroadcastshootingsystem |