Confusion Matrix Calculator Application icon

Confusion Matrix Calculator 1.0.0

11.7 MB / 100+ Downloads / Rating 5.0 - 1 review



Confusion Matrix Calculator, developed and published by MDApp+, released its latest version, 1.0.0, on 2023-12-15. The app falls under the Medical category on the Google Play Store and has achieved over 100 installs. It currently holds an overall rating of 5.0, based on 1 review.

The Confusion Matrix Calculator APK available on this page is compatible with all Android devices that meet the required specifications (Android 4.1+). It can also be installed on PC and Mac using an Android emulator such as BlueStacks, LDPlayer, and others.



App Details

Package name: co.mdapp.confusionmatrix

Updated: 1 year ago

Developer Name: MDApp+

Category: Medical

New features: see below

App Permissions: see below

Installation Instructions

This article outlines two straightforward methods for installing Confusion Matrix Calculator on Windows PC and Mac.

Using BlueStacks

  1. Download the APK/XAPK file from this page.
  2. Install BlueStacks by visiting http://bluestacks.com.
  3. Open the APK/XAPK file by double-clicking it. This action will launch BlueStacks and begin the application's installation. If the APK file does not automatically open with BlueStacks, right-click on it and select 'Open with...', then navigate to BlueStacks. Alternatively, you can drag-and-drop the APK file onto the BlueStacks home screen.
  4. Wait a few seconds for the installation to complete. Once done, the installed app will appear on the BlueStacks home screen. Click its icon to start using the application.

Using LDPlayer

  1. Download and install LDPlayer from https://www.ldplayer.net.
  2. Drag the APK/XAPK file directly into LDPlayer.

If you have any questions, please don't hesitate to contact us.

Previous Versions

Confusion Matrix Calculator 1.0.0
2023-12-15 / 11.7 MB / Android 4.1+

About this app

This Confusion Matrix Calculator determines several statistical measures linked to the performance of classification models such as: Sensitivity, Specificity, Positive Predictive Value (Precision), Negative Predictive Value, False Positive Rate, False Discovery Rate, False Negative Rate, Accuracy & Matthews Correlation Coefficient.

Statistical measures based on the confusion matrix

The confusion matrix is a popular representation of the performance of classification models. It compares the correctly and incorrectly classified values against the actual outcomes in the test data. The four variables are:

True positive (TP) – which is the outcome where the model correctly predicts positive class (condition is correctly detected when present);
True negative (TN) – which is the outcome where the model correctly predicts negative class (condition is not detected when absent);
False positive (FP) – which is the outcome where the model incorrectly predicts positive class (condition is detected despite being absent);
False negative (FN) – which is the outcome where the model incorrectly predicts negative class (condition is not detected despite being present).
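Assuming binary labels are encoded as 1 (positive) and 0 (negative), the four counts can be tallied with a short Python sketch (the function name is illustrative, not taken from the app):

```python
def confusion_counts(actual, predicted):
    """Tally TP, TN, FP, FN from paired actual/predicted binary labels."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp, tn, fp, fn
```

All of the measures below are derived from these four counts.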

One of the most commonly determined statistical measures is Sensitivity (also known as recall, hit rate or true positive rate TPR). Sensitivity measures the proportion of actual positives that are correctly identified as positives.

Sensitivity = TP / (TP + FN)
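The formula translates directly to Python; for example, a test that detects 45 of 50 actual positives has a sensitivity of 0.9:

```python
def sensitivity(tp, fn):
    """Proportion of actual positives correctly identified (recall / TPR)."""
    return tp / (tp + fn)
```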

Specificity, also known as selectivity or true negative rate (TNR), measures the proportion of actual negatives that are correctly identified as negatives.

Specificity = TN / (FP + TN)
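In the same sketch style, a test that correctly rules out 90 of 100 actual negatives has a specificity of 0.9:

```python
def specificity(tn, fp):
    """Proportion of actual negatives correctly identified (TNR)."""
    return tn / (fp + tn)
```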

The Positive Predictive Value (PPV), also known as Precision, and the Negative Predictive Value (NPV) are the proportions of positive and negative results that are true positives and true negatives, respectively. They are also called positive and negative predictive agreements, and both are measures of the performance of a diagnostic test.

Positive Predictive Value (Precision) = TP / (TP + FP)

Negative Predictive Value = TN / (TN + FN)
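Both predictive values follow the same pattern (hypothetical helper functions, not the app's code):

```python
def positive_predictive_value(tp, fp):
    """Proportion of positive results that are true positives (precision)."""
    return tp / (tp + fp)

def negative_predictive_value(tn, fn):
    """Proportion of negative results that are true negatives."""
    return tn / (tn + fn)
```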

The False Positive Rate (FPR) or fall-out is the ratio between the number of negative events incorrectly categorized as positive (false positives) and the total number of actual negative events (regardless of classification).

False Positive Rate = FP / (FP + TN)
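Note that the FPR is the complement of specificity (FPR = 1 - TNR); a quick sketch:

```python
def false_positive_rate(fp, tn):
    """Proportion of actual negatives incorrectly flagged as positive."""
    return fp / (fp + tn)
```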

In terms of the confusion matrix, the False Discovery Rate (FDR) is the proportion of positive results that are false positives. The same quantity is also used in multiple hypothesis testing to correct for multiple comparisons.

False Discovery Rate = FP / (FP + TP)
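Since the FDR is the complement of the PPV (FDR = 1 - PPV), the sketch mirrors the one for precision:

```python
def false_discovery_rate(fp, tp):
    """Proportion of positive results that are false positives."""
    return fp / (fp + tp)
```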

The False Negative Rate (FNR) measures the proportion of individuals in whom the condition is present but for whom the test result is negative.

False Negative Rate = FN / (FN + TP)
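The FNR is the complement of sensitivity (FNR = 1 - TPR); as a minimal sketch:

```python
def false_negative_rate(fn, tp):
    """Proportion of actual positives the test fails to detect."""
    return fn / (fn + tp)
```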

Accuracy (ACC) is the proportion of correct predictions, both true positives and true negatives, among all cases examined.

Accuracy = (TP + TN) / (TP + TN + FP + FN)
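For example, a model with 45 true positives, 90 true negatives, 10 false positives and 5 false negatives is correct on 135 of 150 cases:

```python
def accuracy(tp, tn, fp, fn):
    """Proportion of all predictions that are correct."""
    return (tp + tn) / (tp + tn + fp + fn)
```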

The F1 Score is a measure of a test’s accuracy, defined as the harmonic mean of precision and recall.

F1 Score = 2TP / (2TP + FP + FN)
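The same formula in Python (equivalent to the harmonic mean of precision and recall):

```python
def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    return 2 * tp / (2 * tp + fp + fn)
```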

The Matthews Correlation Coefficient (MCC) measures the correlation between the observed and predicted classifications and returns a value between -1 and 1:

+1 describes a perfect prediction;
0 indicates the prediction is no better than random;
-1 describes complete disagreement between prediction and observation.

Matthews Correlation Coefficient = (TP × TN - FP × FN) / sqrt((TP + FP) × (TP + FN) × (TN + FP) × (TN + FN))
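As a final sketch, the MCC can be computed as follows. Returning 0 when the denominator vanishes (an empty row or column in the matrix) is a common convention, assumed here rather than documented by the app:

```python
import math

def matthews_corrcoef(tp, tn, fp, fn):
    """MCC in [-1, 1]; 0.0 by convention when the denominator is zero."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

A perfect prediction (no false positives or negatives) yields +1, while a prediction with equal counts in all four cells yields 0.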

New features

This Confusion Matrix Calculator determines several statistical measures linked to the performance of classification models, such as: Sensitivity, Specificity, Positive Predictive Value (Precision), Negative Predictive Value, False Positive Rate, False Discovery Rate, False Negative Rate, Accuracy & Matthews Correlation Coefficient.

App Permissions

Allows applications to open network sockets.