About This Game
VBridger allows VTubers to use face tracking to its fullest potential.
VBridger allows for the augmentation of tracking data, letting users combine, mix, and modify live data for use with VTuber avatars. It comes with several base settings and samples for different types of models. If you have a standard Live2D model, you can use our VTS Compatible settings to enhance tracking quality. If you have a model rigged for ARKit using our parameter guide, you can use our ARKit settings for advanced expression control. If you have an ARKit-compatible VRM model, you can use VMC to send tracking data, and input curves to tune and calibrate the tracking to better fit your face.
With the VBridger - Editor DLC, riggers can unlock the full potential of VBridger and their rigs by gaining the ability to create new outputs and custom controls for their models. Use your face to toggle VRM expressions via VMC, or create logic-based expressions to add flutter to your Live2D wings; the possibilities are endless.
Currently available input sources:
•Tracking data from the iPhone ARKit FACS-based face tracking system, via the following apps:
•iFacialMocap (iPhone)
•FaceMotion3D (iPhone)
•MeowFace (Android)
•VTube Studio (iPhone)
•MediaPipe (Webcam)
•NVIDIA (Webcam) *Requires the NVIDIA webcam DLC from VTube Studio (it's free!)
•Additionally, use your microphone to generate audio inputs.
Currently available output software:
•VTube Studio, via the VTube Studio API, allowing control of Live2D models.
•Virtual Motion Capture (VMC) Protocol, allowing any VMC-compatible app to receive face data from VBridger. As long as an output shares the name of a BlendShapeClip on your VRM, you can control it with VBridger (see the sketch below).
VMC output covers only the model's facial tracking; it cannot send head rotation, eye rotation, or body control at the moment.
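For riggers curious about how that name matching works, here is a minimal sketch of a VMC-protocol blendshape message sent over OSC. The port (39539), the clip name "Joy", and the use of the python-osc library are assumptions for illustration; VBridger sends these messages for you, the sketch only shows the shape of data a VMC-compatible receiver expects.

# Minimal sketch of a VMC-protocol blendshape message over OSC.
# Assumptions: the receiver listens on localhost:39539 (a common VMC default)
# and the VRM has a BlendShapeClip literally named "Joy".
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 39539)  # address/port of the VMC receiver

# Set the value of one BlendShapeClip: first argument is the clip name,
# second is a weight between 0.0 and 1.0.
client.send_message("/VMC/Ext/Blend/Val", ["Joy", 0.75])

# Tell the receiver to apply all blendshape values sent this frame.
client.send_message("/VMC/Ext/Blend/Apply", [])

Because matching is purely by name, an output called "Joy" in VBridger drives the "Joy" clip on the model, and any custom clip can be driven the same way.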
More on the way!
Strongly recommended for people who are good at English and can do professional work. It's great because it adds a lot of extra features, but you have to dig into the details and calibrate everything yourself, so be prepared before buying. The RTX beta feature still has a lot of bugs, so check that it works well for you before relying on it.