Game Info
Motion Capture Fusion [VR] is an immersive roomscale mocap sandbox for artists and animators who wish to create and export motion capture animations, or create live content, using conventional VR hardware. With as little as a single VR HMD and two controllers, users may create mocap on their own avatars. Advanced users may create more detailed motion capture, including full body tracking, and may combine additional sensors (e.g. the Apple iPhone TrueDepth sensor and Oculus Quest 2 optical finger tracking) to drive avatars from simultaneous inputs. This fusion of multiple sensors can combine many layers of motion capture in a single take, including full body tracking, face capture, lip-sync, gaze tracking and optical finger tracking.
Highlights
- Record motion capture using custom avatars, scenes and props.
- Add custom shaders, textures, emotes, expressions, DynamicBone physics and more.
- Vive Pro Eye, Vive lip tracking and iPhone ARKit face capture support.
- HTC Vive, Valve Index, Oculus Rift, Oculus Quest 2 and some WMR headsets.
- Can be used without VR, with trackers only, or with only iPhone head tracking.
- Add multiple avatars into a scene, build storyboards, react to other prerecorded avatars.
- Camera motion capture and zoom; the player acts as the cinematographer in VR.
- VTOL and fixed-wing flight simulation vehicle platforms for aerial photography shots.
Supported Output Formats
- Exports to SFM (.dmx).
- Exports to Blender (.blend).
- Exports to Unity (.anim).
Live-Link (live production) and Live-Recording plugins
- Live Link plugin available for Unreal Engine (live avatar sync, live recording).
- Live Link plugin available for Blender (live avatar sync, live recording).
- Live Link plugin available for Unity (live avatar sync).
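The plugins' wire protocol is not documented here, but conceptually "live avatar sync" means streaming bone transforms every frame and applying them to a matching armature in the target application. Below is a minimal sketch of that idea on the Blender side, assuming a purely hypothetical UDP stream of JSON bone quaternions; the port, message format and armature name are illustrative, not the actual plugin protocol:

```python
# Hypothetical live-sync receiver: reads JSON bone quaternions from UDP
# and poses a Blender armature. Port, message format and armature name
# are illustrative assumptions, not Mocap Fusion's actual protocol.
import json
import socket

import bpy
from mathutils import Quaternion

PORT = 39540                           # hypothetical port
armature = bpy.data.objects["Avatar"]  # hypothetical armature name

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
sock.setblocking(False)

def apply_latest_packet():
    """Drain the socket and pose the armature from the newest packet."""
    data = None
    try:
        while True:
            data, _ = sock.recvfrom(65536)
    except BlockingIOError:
        pass  # queue drained
    if data is None:
        return
    # Assumed message shape: {"bone_name": [w, x, y, z], ...}
    for name, quat in json.loads(data).items():
        bone = armature.pose.bones.get(name)
        if bone is not None:
            bone.rotation_mode = 'QUATERNION'
            bone.rotation_quaternion = Quaternion(quat)

def tick():
    apply_latest_packet()
    return 1 / 60  # ask Blender to call us again in ~16 ms

bpy.app.timers.register(tick)
```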
Import custom avatars and use the same avatar throughout the production workflow. This eliminates the need for retargeting and ensures the mocap data always fits 1:1, without introducing offsets in the final results.
One of Mocap Fusion's unique features is its ability to export motion capture data and reconstruct the scene in Blender, making it available for final rendering in minutes.
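Because the export is an ordinary .blend file, it drops straight into a standard Blender pipeline. As a minimal sketch, assuming placeholder file paths, the exported take could even be rendered headless:

```python
# render_take.py -- render an exported take without opening the Blender UI.
# Paths are placeholders; run with: blender --background --python render_take.py
import bpy

# Open the .blend produced by the export (avatars, props and audio are
# already in the scene, so no retargeting step is needed).
bpy.ops.wm.open_mainfile(filepath="/path/to/exported_take.blend")

scene = bpy.context.scene
scene.render.filepath = "//renders/take_"  # '//' means relative to the .blend
bpy.ops.render.render(animation=True)      # render the whole recorded take
```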
Compatible Headsets (VR HMDs)
- Valve Index
- HTC Vive (and Vive Pro Eye).
- Oculus Quest (1 and 2).
Optional Tracking Hardware
- SteamVR Vive trackers.
- iPhone TrueDepth sensor (face capture and eye tracking).
- Oculus Quest 2 (full optical finger tracking).
Capabilities
- Export mocap and create scenes in Blender™ instantly.
- Full body tracking with HTC™ Vive Trackers (up to 11 optional points).
- Record, play back, pause, slow-mo and scrub mocap in VR.
- Customizable IK profiles and avatar parameters.
- SteamVR Knuckles support for individual finger articulation.
- Quest 2 optical finger tracking app for individual finger articulation and finger separation.
- Vive Pro Eye blink and gaze tracking support.
- Sidekick iOS face capture app (TrueDepth markerless AR facial tracking).
- User-customizable worlds, avatars and props may be built for mocap using the APS_SDK.
- Compatible with existing Unity3D™ avatars and environments.
- Supports custom shaders on mocap avatars.
- DynamicBone support for adding hair, clothing and body physics simulation to avatars.
- Breathing simulation for added chest animation.
- Add/record/export VR cameras for realistic camera mocap (e.g. VR cameraman effect).
- Optimization for exporting mocap (.bvh) data to Daz 3D (see the import sketch after this list).
- Placement of "streaming" cameras for livestreaming avatars to OBS or as desktop overlays.
- Microphone audio recording with lip-sync visemes and recordable jaw bone rotation.
- Storyboard mode, save mocap experiences as pages for replaying or editing later.
- Animatic video player, display stories and scripts, choreograph movement.
- Dual-handed weapon IK solvers for natural handling of carbines.
- Recordable VTOL platform for animating helicopter flight simulation (e.g. news choppers).
- VR Camcorders and VR selfie cams may be rigidly linked to trackers.
- VR props and firearms may be rigidly linked to trackers.
- Ghost curves for visualizing the future locations of multiple avatars in a scene.
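As referenced in the Daz 3D bullet above, an exported .bvh take can be sanity-checked by loading it back through Blender's bundled BVH importer; a minimal sketch, with a placeholder file path:

```python
# Round-trip check of an exported .bvh take using Blender's bundled
# BVH importer (the file path is a placeholder).
import bpy

bpy.ops.import_anim.bvh(
    filepath="/path/to/exported_take.bvh",
    rotate_mode='NATIVE',    # keep the rotation order stored in the file
    update_scene_fps=True,   # match the scene FPS to the take
)
action = bpy.context.object.animation_data.action
print(f"Imported take spans {int(action.frame_range[1])} frames")
```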
Gameplay
The experience depends on the user's PC and the tracking hardware used. The recommended SteamVR headsets are the Valve Index or the HTC Vive; a Quest HMD may also produce reasonable results. It is also possible to use the software without an HMD (e.g. when livestreaming). Full body tracking requires feet and hip trackers (with optional elbow, knee and chest trackers), and users may achieve more realistic tracking results when using body trackers; however, body trackers are optional and standing mocap is supported without them. Further realism may be achieved on compatible avatars by also enabling face capture or using a Vive Pro Eye for gaze and blink tracking.
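For checking a full body setup before recording, the tracker count can be verified against SteamVR directly. A minimal sketch using the third-party pyopenvr bindings (pip install openvr), which are independent of Mocap Fusion itself:

```python
# Count the generic trackers SteamVR currently sees, via the third-party
# pyopenvr bindings. This is a setup check, not part of Mocap Fusion.
import openvr

vr = openvr.init(openvr.VRApplication_Background)  # requires SteamVR running
trackers = [
    i for i in range(openvr.k_unMaxTrackedDeviceCount)
    if vr.getTrackedDeviceClass(i) == openvr.TrackedDeviceClass_GenericTracker
]
print(f"{len(trackers)} tracker(s) connected")
openvr.shutdown()
```

Feet-plus-hip full body tracking implies at least three generic trackers reported here; elbows, knees and chest extend that toward the 11 points listed under Capabilities.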
History
Originally this was designed as an intuitive way for users to create virtual training videos and presentations in an immersive VR environment for added realism, and then export their animations for rendering. The project was made available to a community for beta testing and has since received feedback and many feature requests, which have helped add to the utility of the software for a variety of different creators.
User Review (Steam ID 76561198073881640)
A motion capture program built around VR hardware. It supports a VR HMD and Vive trackers out of the box, and you can do basic motion capture using Oculus camera-based hand tracking and facial capture from a Quest Pro or Vive Pro Eye. Basically, as with VRChat, you build an avatar using Unity and the SDK, mocap with that avatar, and it exports to a Blender (.blend) file; the avatar, props, sound and everything else used during the capture are carried over into the Blender file.

The developer's feedback and communication are on the very fast side; mention a bug or a feature you'd like added on Discord and you get a response about it almost immediately. Seems passionate. Additionally, a Live Link feature enables real-time motion capture in Blender, Unreal and Unity, so it seems you can use it directly in a game engine without an intermediate authoring and conversion step. However, the Unreal Live Link plugin costs nearly 80,000-90,000 won, about the same as the mocap program itself, which makes it a slightly burdensome purchase, so there also seems to be the option of the older-version (~5.1) Live Link plugin distributed on Discord.

If you want to do VR motion capture on the Blender side, this is probably a good choice; it's not easy to find something in the 100,000-won range that supports full body mocap plus extras like hand tracking. Other programs that can do VR mocap were, if I remember right, $150 or $200 just for body-only tracking with trackers, so among those this looks pretty decent. I'm still trying this and that, so I'll look a bit more and finish the review later.

+ Digging through the Discord for info, it seems mocap is at least possible with Vive trackers alone; an HMD doesn't appear to be mandatory.

+ There's a wiki on the Discord, but it seems to document an older version of the UI, so it's better to refer to the separate MOCAP FUSION site. That said, if you get stuck or ask on Discord how to do something, the developer shows up and walks you through it video by video lol