nnsvs.mdn
Layers
 class nnsvs.mdn.MDNLayer(in_dim, out_dim, num_gaussians=30, dim_wise=False)[source]
Mixture Density Network layer
The input maps to the parameters of a Mixture of Gaussians (MoG) probability distribution, where each Gaussian has out_dim dimensions and diagonal covariance. If dim_wise is True, features of each dimension are modeled by independent 1-D GMMs instead of being modeled jointly. This can work around training difficulties, especially for high-dimensional data.
Implementation references:

1. Mixture Density Networks by Mike Dusenberry https://mikedusenberry.com/mixture-density-networks
2. PRML book https://www.microsoft.com/en-us/research/people/cmbishop/prml-book/
3. sagelywizard/pytorch_mdn https://github.com/sagelywizard/pytorch_mdn
4. sksq96/pytorch-mdn https://github.com/sksq96/pytorch-mdn
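As a rough illustration of what such a layer computes, here is a minimal sketch (not the nnsvs implementation) of an MDN head that maps (B, T, in_dim) features to MoG parameters with diagonal covariance; the class name ToyMDNLayer and all shapes below are illustrative:

```python
import torch
from torch import nn


class ToyMDNLayer(nn.Module):
    """Minimal MDN layer sketch (hypothetical, not the nnsvs code):
    maps (B, T, in_dim) features to MoG parameters."""

    def __init__(self, in_dim, out_dim, num_gaussians=30):
        super().__init__()
        self.out_dim = out_dim
        self.num_gaussians = num_gaussians
        # Mixture weights, returned as log-probabilities via log_softmax
        self.log_pi = nn.Linear(in_dim, num_gaussians)
        # Per-component log standard deviations and means
        self.log_sigma = nn.Linear(in_dim, num_gaussians * out_dim)
        self.mu = nn.Linear(in_dim, num_gaussians * out_dim)

    def forward(self, x):
        B, T, _ = x.shape
        log_pi = torch.log_softmax(self.log_pi(x), dim=-1)  # (B, T, G)
        log_sigma = self.log_sigma(x).view(B, T, self.num_gaussians, self.out_dim)
        mu = self.mu(x).view(B, T, self.num_gaussians, self.out_dim)
        return log_pi, log_sigma, mu


layer = ToyMDNLayer(in_dim=8, out_dim=4, num_gaussians=3)
log_pi, log_sigma, mu = layer(torch.randn(2, 5, 8))
print(log_pi.shape, log_sigma.shape, mu.shape)
```

Predicting log standard deviations (rather than raw sigma) keeps the scale parameter unconstrained during optimization; the positive sigma is recovered with exp.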
 forward(minibatch)[source]
Forward for MDN
 Parameters:
minibatch (torch.Tensor) – tensor of shape (B, T, D_in), where B is the batch size, T is the sequence length of the batch, and D_in is in_dim.
 Returns:
  torch.Tensor: Tensor of shape (B, T, G) or (B, T, G, D_out)
Log of the mixture weights. G is num_gaussians and D_out is out_dim.
  torch.Tensor: Tensor of shape (B, T, G, D_out)
Log of the standard deviation of each Gaussian.
  torch.Tensor: Tensor of shape (B, T, G, D_out)
Mean of each Gaussian.
 Return type:
tuple of torch.Tensor
Loss
Calculates the error, i.e. the negative log-likelihood of the target, given the MoG parameters and the target.
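The negative log-likelihood of a diagonal-covariance MoG can be sketched as follows (a hypothetical stand-alone function, not the nnsvs implementation); the per-component Gaussian log-density is summed over dimensions and the mixture is combined with logsumexp for numerical stability:

```python
import math

import torch


def mdn_nll(log_pi, log_sigma, mu, target):
    """Negative log-likelihood of target under a diagonal MoG (a sketch).

    log_pi: (B, T, G); log_sigma, mu: (B, T, G, D); target: (B, T, D)
    """
    target = target.unsqueeze(2)  # (B, T, 1, D), broadcasts over G
    # log N(x; mu, sigma) per dimension, summed over D (diagonal covariance)
    log_prob = -0.5 * (
        ((target - mu) / log_sigma.exp()) ** 2
        + 2.0 * log_sigma
        + math.log(2.0 * math.pi)
    ).sum(dim=-1)  # (B, T, G)
    # Log of the mixture density: logsumexp over components
    log_mix = torch.logsumexp(log_pi + log_prob, dim=-1)  # (B, T)
    return -log_mix.mean()


B, T, G, D = 2, 5, 3, 4
log_pi = torch.log_softmax(torch.randn(B, T, G), dim=-1)
loss = mdn_nll(log_pi, torch.randn(B, T, G, D), torch.randn(B, T, G, D),
               torch.randn(B, T, D))
print(loss.item())
```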
Inference
Returns the mean and standard deviation of the Gaussian component whose mixture weight is the largest, as the most probable prediction.
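The idea can be sketched as an argmax over the mixture weights followed by a gather along the component axis (a hypothetical helper, not the nnsvs implementation):

```python
import torch


def most_probable_sigma_mu(log_pi, log_sigma, mu):
    """Pick sigma and mu of the component with the largest mixture weight.

    log_pi: (B, T, G); log_sigma, mu: (B, T, G, D). A sketch only.
    """
    # (B, T): index of the dominant component per frame
    idx = log_pi.argmax(dim=-1)
    # Expand to (B, T, 1, D) so we can gather along the component axis
    idx = idx[..., None, None].expand(-1, -1, 1, mu.shape[-1])
    sigma = torch.gather(log_sigma.exp(), 2, idx).squeeze(2)  # (B, T, D)
    mu_top = torch.gather(mu, 2, idx).squeeze(2)              # (B, T, D)
    return sigma, mu_top


B, T, G, D = 2, 5, 3, 4
log_pi = torch.log_softmax(torch.randn(B, T, G), dim=-1)
sigma, mu_hat = most_probable_sigma_mu(log_pi, torch.randn(B, T, G, D),
                                       torch.randn(B, T, G, D))
print(sigma.shape, mu_hat.shape)
```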

Samples from the Gaussian component whose mixture weight is the largest, i.e. the most probable component.
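Such sampling can be sketched as selecting the dominant component and drawing mu + sigma * eps with eps ~ N(0, I) (a hypothetical helper, not the nnsvs implementation):

```python
import torch


def sample_from_top_component(log_pi, log_sigma, mu):
    """Draw one sample from the Gaussian component with the largest
    mixture weight. A sketch only; shapes as in the forward docs."""
    # (B, T, 1, D) index of the dominant component, for gather along G
    idx = log_pi.argmax(dim=-1)[..., None, None].expand(-1, -1, 1, mu.shape[-1])
    sigma = torch.gather(log_sigma.exp(), 2, idx).squeeze(2)  # (B, T, D)
    mu_top = torch.gather(mu, 2, idx).squeeze(2)              # (B, T, D)
    # Reparameterized draw from N(mu_top, diag(sigma^2))
    return mu_top + sigma * torch.randn_like(mu_top)


B, T, G, D = 2, 5, 3, 4
log_pi = torch.log_softmax(torch.randn(B, T, G), dim=-1)
sample = sample_from_top_component(log_pi, torch.randn(B, T, G, D),
                                   torch.randn(B, T, G, D))
print(sample.shape)
```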