A Differentiable Loss Function for Time-Series in CUDA
Soft DTW Loss Function for PyTorch in CUDA

This is a PyTorch implementation of Soft-DTW: a Differentiable Loss Function for Time-Series. It supports batched computation, is CUDA-friendly, and is feasible to use as a final loss. I can confirm that you can train a (sequential) model with this as a final loss! The following image shows the training logs of a TTS model using the Soft-DTW loss function.

There are some previous implementations:

- mblondel's soft-dtw
- lyprince's sdtw_pytorch
- Maghoumi's pytorch-softdtw-cuda

But they are […]
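As a rough illustration of what "batch-supported, CUDA-friendly, and usable as a final loss" means in practice, here is a minimal usage sketch. The module path `soft_dtw_cuda`, the `SoftDTW(use_cuda=..., gamma=...)` constructor, and the call signature are assumptions modeled on similar Soft-DTW implementations, not a confirmed API of this repository.

```python
# Minimal usage sketch. The import path and the SoftDTW signature are
# assumptions based on similar Soft-DTW implementations (hypothetical here).
import torch
from soft_dtw_cuda import SoftDTW  # hypothetical module name

# Batched inputs of shape (batch, time, features); the two sequences
# may have different lengths, which is the usual DTW setting.
batch, t_pred, t_tgt, feats = 4, 120, 100, 80
pred = torch.randn(batch, t_pred, feats, requires_grad=True, device="cuda")
target = torch.randn(batch, t_tgt, feats, device="cuda")

# gamma controls the softness of the soft-min; smaller values approach
# ordinary DTW at the cost of less smooth gradients.
criterion = SoftDTW(use_cuda=True, gamma=0.1)

loss = criterion(pred, target).mean()  # one soft-DTW value per batch item
loss.backward()                        # differentiable, so it can serve as a final loss
```

Because the soft-min is differentiable everywhere, gradients flow back through the alignment, which is what makes it viable as the final training objective for a sequential model such as TTS.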