
Rogers Opening Statement at Biometric Technology Hearing

February 6, 2020


WASHINGTON – Rep. Mike Rogers (R-Ala.), House Homeland Security Committee ranking member, today delivered an opening statement at a hearing entitled, “About Face: Examining the Department of Homeland Security’s Use of Facial Recognition and Other Biometric Technologies, Part II.”

After the tragic events of September 11th, Congress recognized that biometric systems are essential to our homeland security.

Following the recommendation of the 9/11 Commission, Congress charged DHS with the creation of an automated biometric entry and exit system.

Customs and Border Protection and the Transportation Security Administration have already demonstrated the capability of biometrics to improve security, facilitate travel, and better enforce existing immigration laws.

Government and the private sector have made enormous strides in the accuracy, speed, and deployment of biometric systems.

Biometric technologies of all types have seen improvements.

The advances in facial recognition algorithms in particular are transformational.

The National Institute of Standards and Technology is the leader in testing and evaluation for biometric technologies.

Dr. Romine and his team have done incredible work to help Congress, DHS, and industry understand the capability of currently available algorithms.

But I’m concerned that some of my colleagues have already jumped to misleading conclusions regarding the NIST report on facial recognition.

Just hours after NIST released over 1,200 pages of technical data, the majority tweeted “This report shows facial recognition is even more unreliable and racially biased than we feared…”

If the majority had taken the time to read the full report before tweeting, they would have found the real headline: NIST determined that the facial recognition algorithm being adopted by DHS had no statistically detectable race or gender bias.

In other words, NIST could find NO statistical evidence that the facial recognition algorithm DHS is adopting contains racial bias.

NIST found measurable and significant errors and bias in OTHER facial recognition algorithms, but NOT in the algorithm used by DHS.

I hope that my colleagues will listen when Dr. Romine explains how the NIST report proves that race or gender bias is statistically undetectable in the most accurate algorithms.

The reality is that facial recognition technologies can improve existing processes by reducing human error.

These technologies are tools that cannot and will not replace the final judgment of CBP or TSA officers.

Concerns regarding privacy and civil rights are well intentioned.

But these concerns can be fully addressed in how biometric systems are implemented by DHS.

I look forward to hearing the steps CRCL is taking to coordinate with CBP and protect the privacy and civil rights of Americans.

But as I have said before, halting all government biometric programs is not the solution.

Doing so ignores these critical facts: The technology DHS uses is NOT racially biased; It does NOT violate the civil rights of Americans; It IS accurate; And most importantly, it DOES protect the homeland.

I appreciate the Chairman calling this hearing today. It’s important for Congress to further educate itself on this issue. I look forward to getting the facts on the record.


###