
Table 18 Comparing FCA-Map with the OAEI 2016 top-ranked systems

From: Matching biomedical ontologies based on formal concept analysis

| Matching task | Metric | XMap | AML | LogMap | LogMapBio | FCA-Map |
|---|---|---|---|---|---|---|
| Conference | Precision | 0.85 | 0.84 | 0.82 | 0.77 | 0.75 |
| | Recall | 0.57 | 0.66 | 0.59 | 0.56 | 0.52 |
| | F-Measure | 0.68 | 0.74 | 0.69 | 0.65 | 0.61 |
| MA-NCI | Precision | 0.929 | 0.95 | 0.918 | 0.888 | 0.932 |
| | Recall | 0.865 | 0.936 | 0.846 | 0.896 | 0.837 |
| | F-Measure | 0.896 | 0.943 | 0.88 | 0.892 | 0.882 |
| FMA-NCI | Precision | 0.977 | 0.936 | 0.949 | 0.935 | 0.954 |
| | Recall | 0.901 | 0.902 | 0.901 | 0.910 | 0.917 |
| | F-Measure | 0.937 | 0.931 | 0.924 | 0.923 | 0.935 |
| FMA-SNOMED | Precision | 0.989 | 0.953 | 0.948 | 0.944 | 0.936 |
| | Recall | 0.846 | 0.727 | 0.690 | 0.696 | 0.803 |
| | F-Measure | 0.912 | 0.825 | 0.799 | 0.801 | 0.865 |
| SNOMED-NCI | Precision | 0.911 | 0.904 | 0.922 | 0.896 | 0.914 |
| | Recall | 0.564 | 0.713 | 0.663 | 0.675 | 0.666 |
| | F-Measure | 0.697 | 0.797 | 0.771 | 0.770 | 0.771 |
| FMA-NCI Whole | Precision | 0.902 | 0.838 | 0.854 | 0.818 | 0.409 |
| | Recall | 0.847 | 0.872 | 0.802 | 0.835 | 0.872 |
| | F-Measure | 0.874 | 0.855 | 0.827 | 0.826 | 0.557 |
| FMA-SNOMED Whole | Precision | 0.965 | 0.882 | 0.839 | 0.808 | 0.452 |
| | Recall | 0.843 | 0.687 | 0.634 | 0.640 | 0.773 |
| | F-Measure | 0.900 | 0.773 | 0.722 | 0.714 | 0.571 |
| SNOMED-NCI Whole | Precision | 0.904 | 0.870 | 0.842 | 0.786 | – |
| | Recall | 0.668 | 0.596 | 0.637 | 0.686 | – |
| | F-Measure | 0.768 | 0.708 | 0.725 | 0.732 | – |
| HP-MP vote 2 | Precision | 1.000 | 0.931 | 0.935 | 0.918 | 0.984 |
| | Recall | 0.333 | 0.800 | 0.913 | 0.932 | 0.754 |
| | F-Measure | 0.500 | 0.860 | 0.924 | 0.925 | 0.854 |
| HP-MP vote 3 | Precision | 1.000 | 0.854 | 0.773 | 0.755 | 0.942 |
| | Recall | 0.435 | 0.945 | 0.973 | 0.982 | 0.924 |
| | F-Measure | 0.606 | 0.897 | 0.862 | 0.854 | 0.933 |
| DOID-ORDO vote 2 | Precision | 0.985 | 0.853 | 0.952 | 0.920 | 0.966 |
| | Recall | 0.569 | 0.971 | 0.878 | 0.898 | 0.959 |
| | F-Measure | 0.721 | 0.908 | 0.913 | 0.909 | 0.962 |
| DOID-ORDO vote 3 | Precision | 0.977 | 0.778 | 0.905 | 0.864 | 0.888 |
| | Recall | 0.632 | 0.998 | 0.938 | 0.949 | 0.993 |
| | F-Measure | 0.767 | 0.878 | 0.921 | 0.905 | 0.937 |
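
The F-Measure values in the table appear to follow the standard balanced F-measure (the harmonic mean of precision and recall), which is the usual OAEI reporting convention; this is an assumption, as the table itself does not state the formula. A minimal worked check against the Conference/XMap row:

```latex
F = \frac{2 \cdot P \cdot R}{P + R}
\qquad\text{e.g.}\qquad
\frac{2 \cdot 0.85 \cdot 0.57}{0.85 + 0.57} \approx 0.68
```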