Suppose you have implemented a fingerprint system and realized after some time that the poor quality of stored fingerprint templates is preventing you from achieving your goal (e.g. accurately identifying individuals).
You then find corrective measures that improve the quality of newly captured fingerprint templates (e.g. raising the capture quality threshold, improving SOPs, retraining users, etc.), allowing you to capture fingerprints with much better quality. The question then is: how do you deal with your legacy templates? Is it better to:
- restart from scratch with a new database, retake patients' fingerprints as they come back to the facility, and generate a new Unique ID for these patients?
- ‘replace’ patient templates as they return to the facility? (If so, how do you eliminate the old poor-quality templates, and based on which criteria?)
- Restart from scratch → the problem with this method is that you will lose potentially helpful/important information.
What is important to note is that in the ISO format, multiple representations of each fingerprint can and should be captured. Effectively, these ISO format files can grow slightly as better-quality fingerprints are captured and stored. If you do not want to deal with larger files, then what needs to be used is the quality score in the quality record field (assuming this field exists and was populated correctly during the capture process).
- If the new score is significantly higher than the existing fingerprint's score, that fingerprint can be replaced with the newer, better one.
- If the score is only marginally higher, or still not very high on its own, a second representation can be stored, and your matching algorithms can account for the multiple representations.
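The two rules above can be sketched as a simple update policy. This is only an illustrative example: the thresholds (`REPLACE_MARGIN`, `HIGH_QUALITY`), the function name, and the tuple-based template representation are all assumptions for the sketch, not part of any ISO specification.

```python
# Hypothetical template-update policy based on quality scores.
# Thresholds below are illustrative assumptions, not standard values.

REPLACE_MARGIN = 20   # new score must exceed the old one by this much to replace
HIGH_QUALITY = 80     # scores at or above this are considered reliably high

def update_templates(stored, new_template):
    """Fold a newly captured template into the stored representations.

    `stored` is a list of (template_bytes, quality_score) tuples;
    `new_template` is a single (template_bytes, quality_score) tuple.
    Returns the updated list of representations.
    """
    new_bytes, new_score = new_template
    if not stored:
        return [new_template]

    # Compare against the best existing representation.
    best_idx = max(range(len(stored)), key=lambda i: stored[i][1])
    best_score = stored[best_idx][1]

    if new_score >= best_score + REPLACE_MARGIN and new_score >= HIGH_QUALITY:
        # Significantly better and high on its own: replace the old template.
        updated = list(stored)
        updated[best_idx] = new_template
        return updated
    if new_score > best_score:
        # Only marginally better: keep both representations so the
        # matcher can take advantage of multiple samples per finger.
        return list(stored) + [new_template]
    # Not better: leave the stored set unchanged.
    return stored
```

In practice the replacement criterion would come from whatever quality algorithm produced the stored scores, so the margin and cutoff should be tuned against that algorithm's score distribution.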