From the report: PTRA tools and other algorithmic risk assessment tools are widely used in the criminal justice system, largely without adequate regulation. While these tools can automate certain parts of an overburdened bail system, they have been shown to have a significant discriminatory impact and a limited positive impact on outcomes. They rely heavily on historically biased law enforcement data, and they stigmatize poverty as well as certain immutable characteristics. Developers of these tools do not address these societal problems; they simply encode them. And while use of these tools has proliferated, so have criticisms and legal challenges. Accordingly, EPIC recommends that transparency and accountability measures be put into place to ensure that these tools do not further embed systemic biases.

This is a report of the Electronic Privacy Information Center (EPIC), updated September 2020.