
Confusion about the meaning of the evaluation results #8

@iPwnXX

Description


In the example Jupyter notebook, for some metrics the enhanced result is smaller than the noisy one, while for others it is larger; e.g., a higher PESQ should mean better quality, but that's not the case in the demo.

Also, the scale/range of each metric isn't mentioned in the README. I know some lie between 0 and 5, but I'm not familiar with the others.
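To make the polarity question concrete, here is a minimal sketch (not this repo's actual API; `METRIC_INFO` and `improved` are hypothetical names) that records commonly cited ranges and whether higher is better for a few speech-enhancement metrics, then checks whether an enhanced score beats a noisy one:

```python
# Commonly cited ranges and polarity for common speech metrics.
# NOTE: these bounds are assumptions based on the usual definitions;
# exact limits depend on the specific implementation used.
METRIC_INFO = {
    # name: (low, high, higher_is_better)
    "PESQ": (-0.5, 4.5, True),     # ITU-T P.862 MOS-LQO scale
    "STOI": (0.0, 1.0, True),      # short-time objective intelligibility
    "SI-SDR": (None, None, True),  # in dB, unbounded in both directions
}

def improved(metric: str, noisy_score: float, enhanced_score: float) -> bool:
    """Return True if the enhanced score is better than the noisy one,
    taking the metric's polarity into account."""
    higher_is_better = METRIC_INFO[metric][2]
    if higher_is_better:
        return enhanced_score > noisy_score
    return enhanced_score < noisy_score

print(improved("PESQ", 1.8, 2.6))  # True: higher PESQ means better quality
print(improved("PESQ", 2.6, 1.8))  # False: enhancement made PESQ worse
```

If the demo shows the "False" case for PESQ, that would indeed indicate the enhanced audio scored worse than the noisy input, which is worth clarifying in the notebook or README.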
