verachen/VoxCPM
Files at 8cfd9d155a9d240b6b1a15db45ac221c15dfba09

Path: VoxCPM/src/voxcpm

Latest commit: jayllfpt, de11c6a8cb: "OPTIMIZE: Improve sample length computation by using batch column access" (2025-12-20 06:32:39 +07:00)
model/          FIX: When a prompt is present, concatenate two patches as the context for VAE decoding (2025-12-15 20:37:02 +08:00)
modules/        Update: VoxCPM1.5 and fine-tuning support (2025-12-05 21:04:51 +08:00)
training/       OPTIMIZE: Improve sample length computation by using batch column access (2025-12-20 06:32:39 +07:00)
utils/          Replace the text normalization library (2025-09-16 22:16:40 +08:00)
__init__.py     init (2025-09-16 11:46:47 +08:00)
cli.py          Modify LoRA inference API (2025-12-05 22:22:13 +08:00)
core.py         Add LoRA fine-tune WebUI; optimize LoRA save and load logic (2025-12-09 21:34:39 +08:00)
zipenhancer.py  Support loading a model from a local path (2025-09-16 16:46:44 +08:00)