How VTube Studio’s NVIDIA integration is making VTubing more accessible

VTubing can be easy to pick up, but at the higher end it can be an expensive and resource-intensive process. VTube Studio and NVIDIA are looking to change that with new technology aimed at making the streaming medium more accessible to all.

Getting into VTubing can be as simple as grabbing a couple of PNG images, plugging them into a Discord bot, and going live on Twitch. However, it can also be a complex mix of Live2D or 3D models, with individual body parts animated and rigged to follow the streamer’s every move.

That upper echelon of VTubing can be quite hard to access. It requires high-quality equipment ⁠— either an iPhone or iPad to use face tracking, or a beefy enough computer to track those same expressions from a webcam. Either way, it’s a big hardware investment to get things running smoothly.

VTube Studio is one such face tracking program, and one of the most popular out there. It lets users load up their model on PC before using a webcam or smartphone to track their movements and mimic them on the model.

Denchi, the person behind VTube Studio, knows just how resource intensive the whole medium can be. While it’s inherently a PC application, many users opt to run it on their smartphone and send tracking data to their computer to cut down on CPU and GPU usage.
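
That phone-to-PC setup is, at its core, a small networking job: the phone does the heavy computer vision work and streams a handful of parameter values per frame to the desktop app, which only has to apply them to the model. As a rough illustration only (the port number and JSON wire format below are assumptions, not VTube Studio’s actual protocol), a PC-side receiver might look something like this:

```python
# Minimal sketch of a PC-side receiver for face tracking data sent from a phone.
# The port number and payload layout are illustrative assumptions, NOT
# VTube Studio's real protocol.
import json
import socket

TRACKING_PORT = 21412  # hypothetical UDP port the phone app streams to

def receive_tracking_frames():
    """Yield (timestamp, params) pairs streamed from a phone tracking app."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", TRACKING_PORT))
    while True:
        packet, _addr = sock.recvfrom(4096)
        frame = json.loads(packet.decode("utf-8"))
        # Assumed payload shape:
        # {"timestamp": 1234.5, "params": {"mouthOpen": 0.8, "eyeBlinkLeft": 0.1}}
        yield frame["timestamp"], frame["params"]

if __name__ == "__main__":
    for timestamp, params in receive_tracking_frames():
        print(timestamp, params)
```

Offloading the vision work this way is what keeps CPU and GPU usage on the streaming PC low: the desktop side only has to apply a few dozen float values per frame.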

“Over the last few years, I’ve tried out pretty much every face tracking framework, but they are often unstable, highly experimental, or prohibitively expensive,” they told Dexerto.

“Most people use either the webcam-based face tracking or the iOS face tracking right now. The existing webcam face tracking in VTube Studio, an open-source library called OpenSeeFace, is already really impressive, especially considering it’s made from scratch by a single person.

“But both the webcam-based tracking and iOS tracking have their issues. The webcam-tracking is relatively resource-intensive and isn’t as accurate as iOS tracking, while the iOS tracking is very accurate and tracks more facial features, but users need an expensive iPhone or iPad to use it.”
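
The practical difference between the two pipelines comes down to how many facial values arrive each frame and how they get mapped onto the model’s parameters. The sketch below uses real ARKit blendshape names and standard Live2D Cubism parameter IDs, but the mapping table and logic are simplified assumptions for illustration, not how VTube Studio actually does it:

```python
# Sketch: mapping tracked face values onto Live2D model parameters.
# ARKit-style iOS tracking delivers dozens of blendshapes; webcam tracking
# typically produces a smaller set, so fewer model parameters can be driven.
# The mapping table below is illustrative, not VTube Studio's configuration.

BLENDSHAPE_TO_LIVE2D = {
    "jawOpen": "ParamMouthOpenY",
    "eyeBlinkLeft": "ParamEyeLOpen",
    "eyeBlinkRight": "ParamEyeROpen",
    "browInnerUp": "ParamBrowLY",
    "mouthSmileLeft": "ParamMouthForm",
}

def apply_tracking(blendshapes: dict[str, float]) -> dict[str, float]:
    """Convert raw blendshape weights (0..1) into Live2D parameter values."""
    model_params = {}
    for shape, value in blendshapes.items():
        target = BLENDSHAPE_TO_LIVE2D.get(shape)
        if target is None:
            continue  # a simpler tracker may never produce this shape at all
        # Eye-open parameters are inverted: a blink weight of 1.0 means closed.
        if target.startswith("ParamEye"):
            value = 1.0 - value
        model_params[target] = value
    return model_params

print(apply_tracking({"jawOpen": 0.7, "eyeBlinkLeft": 0.9, "cheekPuff": 0.2}))
```

With iOS-style tracking delivering far more facial features than a typical webcam pipeline, more of a rigged model’s parameters can be driven directly, which is why the accuracy gap Denchi describes is visible on screen.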

However, that barrier to entry is coming down even further with a new collaboration between VTube Studio and NVIDIA. The new NVIDIA Broadcast face tracking feature reduces the load on GPUs for VTubers looking to keep everything on their computer, and the Live2D program is one of the first to take advantage of it.

It has been “optimized for running most of the face tracking AI code… on its highly performant tensor cores that all their RTX series cards have”. That’s the same hardware that makes your AAA games look silky smooth on PC, and it can now also help with face tracking.

The tracking also looks smoother without impacting performance too much, potentially even surpassing what’s currently on the market, Denchi claims.

“The performance impact will be minimal and the tracking can be run alongside even the most demanding of games,” they continued. “The accuracy of the NVIDIA face tracking is also extremely good, coming really close to the quality of the current iOS tracking, perhaps even surpassing it in some aspects.”

The feature doesn’t just help VTube Studio; it’s available to any developer looking to use face tracking on NVIDIA GPUs. It opens up a wealth of opportunities for development in the VTubing space, which could see the barrier to entry drop even further.

It’s a space NVIDIA is firmly trying to position itself in too. Gerardo Delgado Cabrera, the Product Line Manager at NVIDIA Studio working on the new Broadcast features, said it’s part of long-term plans to help “optimize” the VTubing space.

“As part of NVIDIA Studio we work with all the top creative apps – as well as upcoming ones,” he told Dexerto. “And one of the hottest areas of development in live streaming is VTubing.

“We reached out to all the top VTubing apps months ago, and started working with all of them to help them optimize their apps. There’s actually been improvements shipped already through NVIDIA Studio drivers to help with optimization and stability.”

NVIDIA Broadcast’s face tracking will go live in October, with an update pushed to VTube Studio at the same time. This will help around 30% of users who have RTX-enabled GPUs. The update will also be completely free for all, and the manufacturer is working with the VTubing community to continually add new features and updates.

This includes a new tool in NVIDIA’s augmented reality software development kit called Face Expression Estimation, which “helps animate face meshes that can convey better emotions,” said Delgado.
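
Expression estimation of this kind generally outputs a vector of coefficients describing how strongly each facial expression is active, which an application can then blend into a face mesh or a model’s parameters. Purely as a hypothetical illustration (the coefficient names, weights, and emotion labels below are assumptions, not the SDK’s actual output):

```python
# Hypothetical sketch: turning per-frame expression coefficients into a simple
# emotion label for a model. The coefficient names and weights are assumptions
# for illustration, not the actual output of NVIDIA's SDK.

EMOTION_WEIGHTS = {
    "happy": {"mouthSmile": 1.0, "cheekRaise": 0.5},
    "surprised": {"browRaise": 1.0, "jawOpen": 0.8},
    "angry": {"browLower": 1.0, "mouthPress": 0.4},
}

def dominant_emotion(coefficients: dict[str, float]) -> str:
    """Score each emotion by its weighted coefficients and return the strongest."""
    scores = {
        emotion: sum(coefficients.get(name, 0.0) * w for name, w in weights.items())
        for emotion, weights in EMOTION_WEIGHTS.items()
    }
    return max(scores, key=scores.get)

print(dominant_emotion({"mouthSmile": 0.8, "cheekRaise": 0.6, "jawOpen": 0.1}))
```

In practice an app would blend mesh or model parameters continuously rather than picking a single label, but the idea is the same: richer coefficients in, more convincing emotions out.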

It shapes up as a big leap for the tech side of the VTubing space, but at the end of the day, it’s only a small part of the experience. There’s still plenty of room for growth in what VTubers could become, and Denchi plans to keep exploring that with VTube Studio.

“I think tracking is certainly going to improve, but I also think it’s important to remember that tracking is only one aspect of VTubing. Personally, most of the VTubers I watch regularly have very basic tracking and often pretty simple models.

“At the end of the day, VTubers aren’t really that different from normal streamers. People watch VTubers because they like their personalities and stream content. While a good tracking setup can be helpful, nothing can replace a fun personality and interesting stream content.

“That’s what I want to focus on with VTube Studio. Most of the features I plan to add in the future are focused on improving viewer interaction and collaboration with other VTubers. That’s what I personally enjoy most and also what I think sets VTubers apart from normal streamers.”
