v1.8.5
🢃 Download Release
Discord Support Server
v1.8.5 Changelog
- fix KAT_Pointer being sent even though KAT is turned off in the settings (for real this time)
- log the device in use, for debugging purposes
v1.8.4 Changelog
- potentially fix a CUDA-related problem.
- force_update.bat now cleans some broken packages.
v1.8.3 Changelog
- fix some KAT parameters being sent even though KAT is disabled (#16)
- bumped a bunch of dependencies
- fix a small bug with OBS-only mode
v1.8.2 Changelog
- fix translation quantization error
- bump ctranslate2 to 3.19.0
v1.8.1 Changelog
- fixed a dependency issue that caused the installation to fail.
v1.8.0 Changelog
- Support for OSCQuery! (Experimental)
- Automatically finds a port to receive data from VRChat on. This removes the need for routing applications.
- Set the OSC Server port to 0 in the settings to use OSCQuery's automatic discovery, or choose any port you want.
- If you don't want or need OSCQuery, just leave the port at 9001.
- You can now disable the OSC Server completely by setting the port to -1; this is useful if you are not going to use KAT (see the port-handling sketch after this changelog).
- Fix timeout_time and pause_threshold always resetting
- update force_update.bat for CPU-only installs
- the updater now ignores beta versions unless you are already on a beta build (an opt-in option is in the works)
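For clarity, here is a minimal sketch of the port convention described above (0 = let OSCQuery pick a free port, -1 = OSC Server disabled, anything else = use that port, e.g. the default 9001). The function names and the OS-assigned-port trick are only illustrative assumptions, not the program's actual code:

```python
# Illustrative sketch only: how the OSC Server port setting can be interpreted.
# Names (find_free_port, resolve_osc_server_port) are hypothetical, not the app's API.
import socket
from typing import Optional


def find_free_port() -> int:
    """Ask the OS for any free UDP port (stand-in for OSCQuery's automatic discovery)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]


def resolve_osc_server_port(configured_port: int) -> Optional[int]:
    """Return the port the OSC server should listen on, or None if it is disabled."""
    if configured_port == -1:
        return None              # -1: OSC Server disabled (e.g. when KAT is not used)
    if configured_port == 0:
        return find_free_port()  # 0: automatic discovery, no routing applications needed
    return configured_port       # any other value, e.g. the default 9001


if __name__ == "__main__":
    for setting in (9001, 0, -1):
        print(setting, "->", resolve_osc_server_port(setting))
```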
Note
If the install fails or the update doesn't show up in the auto updater, please run force_update.bat
in the src folder of your install.
Full Changelog: v1.8.0...v1.8.4
Requirements
With default settings, this program has the following requirements:
- .NET 4.8.1 (Should be preinstalled on Windows 10 and up)
- Visual C++ 2015-2022 Redistributable (x64)
- SteamVR (only if run in VR; no Oculus/Meta support as of now)
- Inference on GPU (Recommended):
- CUDA-enabled GPU (NVIDIA only); otherwise it falls back to the CPU (see the sketch after this list).
- ~11 GB of free space for installation; ~6 GB used after successful installation and model loading.
- ~1 GB of available RAM.
- ~600 MB of available VRAM.
- Inference on CPU:
- ~4 GB of free space for installation; ~2 GB used after successful installation and model loading.
- ~400 MB of available RAM.
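As a rough sketch of the GPU/CPU fallback mentioned above, using the ctranslate2 library the program already depends on (the helper name and the commented-out model path are assumptions, not the program's actual loading code):

```python
# Rough sketch of the CUDA-to-CPU fallback; not the program's actual loading code.
import ctranslate2


def pick_device() -> str:
    """Prefer a CUDA-enabled NVIDIA GPU when one is available, otherwise use the CPU."""
    try:
        if ctranslate2.get_cuda_device_count() > 0:
            return "cuda"
    except Exception:
        pass  # treat any CUDA probing error as "no usable GPU"
    return "cpu"


if __name__ == "__main__":
    device = pick_device()
    print(f"Running inference on: {device}")
    # The real model path comes from the application; "model_dir" is a placeholder.
    # translator = ctranslate2.Translator("model_dir", device=device)
```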
Note
Depending on the settings you change in the program, these requirements can increase considerably.