Reduce time complexity of evaluation of testset when using transform with CalculateFeatureContribution in training for explainability in ML.NET


I need to reduce the time taken by the transform due to performance issues: CalculateFeatureContribution in ML.NET 1.7.0 (https://learn.microsoft.com/en-us/dotnet/api/microsoft.ml.explainabilitycatalog.calculatefeaturecontribution?view=ml-dotnet) increases evaluation time several-fold.

        var testSet = mlContext.Data.LoadFromEnumerable<ModelInput>(evalset);

        IDataView predictions;
        using (var v = new startwatch("Evaluate")) // custom timing helper
        {
            // Scores each test case and, because CalculateFeatureContribution
            // is part of the pipeline, also computes an array of feature
            // contributions per row.
            predictions = model1.Transform(testSet);
        }

This generic function is used to evaluate the test cases.

Average evaluation time when CalculateFeatureContribution is included in the training pipeline: 4.486 ms per test item.

Average evaluation time when CalculateFeatureContribution is NOT included in the training pipeline: 0.756 ms per test item.

The excess time is spent computing the contribution of each feature for each item. Is there a way to reduce the time taken when CalculateFeatureContribution is included?
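One approach I have considered is keeping CalculateFeatureContribution out of the training pipeline entirely and applying it on demand, only to the rows that actually need an explanation. A minimal sketch, assuming the fitted model (`trainedModel` here, a hypothetical name) is an `ISingleFeaturePredictionTransformer` whose inner model supports `ICalculateFeatureContribution` (e.g. a linear or FastTree model); `mlContext` and `testSet` are the same objects as above:

```csharp
// 1. Score the full test set with the cheap pipeline (no contributions).
IDataView predictions = trainedModel.Transform(testSet);

// 2. Build the contribution transformer separately, outside the training pipeline.
var fccEstimator = mlContext.Transforms.CalculateFeatureContribution(
    trainedModel,       // the fitted prediction transformer
    normalize: true);
var fccTransformer = fccEstimator.Fit(testSet);

// 3. Apply it only to the rows that need explaining, e.g. a small
//    sample, rather than paying the per-feature cost on every item.
IDataView sample = mlContext.Data.TakeRows(testSet, 100);
IDataView explained = fccTransformer.Transform(sample);
```

This does not make the per-row contribution computation itself cheaper, but it restricts it to the subset of predictions that are actually inspected, so bulk evaluation runs at the faster 0.756 ms/item rate.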
