Your payment text field messages are monitored, so don’t be an egg

We have all done it. Transferred money to a friend after paying for lunch and left a "funny" message or remark in the payment text field. For my part, I have had my fair share of sexual innuendos and notes hinting at illegal activity attached to a $20 payment that was actually for a burger. But Austrac is watching, and while it doesn't want you to stop having fun, it does want to prevent this feature from being abused.

In early 2017, the Australian Transaction Reports and Analysis Centre (Austrac) launched a public-private initiative to follow the money trail, with the aim of "harnessing and energizing the collective knowledge of government and industry".

Bringing together the financial sector, non-governmental organizations, law enforcement and national security agencies, the Fintel Alliance was designed to fight money laundering and terrorist financing, with a mandate to "collectively stop the criminals".

Some of its remit has since expanded to cover more national issues, such as the misuse of the payment text field.

In a report released Friday, the Fintel Alliance said it had identified an increase in the misuse of payment text fields, with transactions being used as a method of criminal communication or abuse rather than primarily to transfer funds.

"Instead, transaction text fields are increasingly used to communicate for the purpose of stalking, harassing and threatening behavior, or to avoid scrutiny by law enforcement," the report said.

Payment text fields that explicitly threaten a victim or contain profanity considered abusive or offensive are typically picked up by financial service providers. However, Austrac said a significant challenge in identifying genuine cases of criminal communication is the high volume of "false positives" detected when cross-referencing payment text fields against pre-made lists of terms deemed inappropriate.

There are a whole bunch of criteria that banks take into account when determining whether something is wrong, such as the frequency of payments, their value (for example, a dozen $1 transactions in quick succession) and any already-known relationship between the two parties.
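One of those criteria, a burst of low-value payments in quick succession, is easy to sketch as a rule. The function below is a minimal illustration of that idea only; the names and thresholds are hypothetical and do not reflect any bank's or Austrac's actual detection logic.

```python
from datetime import datetime, timedelta

def flag_suspicious_pattern(transactions, window_minutes=10,
                            low_value=1.0, min_count=5):
    """Flag bursts of low-value payments in a short time window.

    `transactions` is a list of (timestamp, amount) tuples.
    Thresholds are illustrative, not real detection rules.
    """
    # Keep only the timestamps of low-value payments.
    low = sorted(t for t, amount in transactions if amount <= low_value)
    if len(low) < min_count:
        return False
    window = timedelta(minutes=window_minutes)
    # Slide over the sorted timestamps looking for min_count
    # payments that all fall inside one window.
    for i in range(len(low) - min_count + 1):
        if low[i + min_count - 1] - low[i] <= window:
            return True
    return False
```

A dozen $1 transfers a minute apart would trip this rule, while a single lunch repayment would not.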

But when it comes to finding abuse, AI is used to search payment text field messages for known "bad" words, as well as to gauge sentiment.

Words or phrases with double meanings often appear in the payment text field and present detection challenges. For example, the words "pig" and "dog" have common meanings and appear in legitimate, non-threatening payment text fields, but can also be used in threatening or abusive ways. And that is before counting swear words that can actually be terms of affection in Australian slang.
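The false-positive problem described above follows directly from context-blind term matching. This is a toy sketch, not any provider's real system: the watchlist is illustrative, and real screening combines such lists with the transaction heuristics and sentiment analysis mentioned earlier.

```python
import re

# Illustrative term list only; real providers use far larger curated lists.
WATCHLIST = {"pig", "dog"}

def flag_message(text):
    """Naively cross-reference a payment description against a term list.

    Returns the matched terms. Because matching ignores context,
    benign messages mentioning a listed word come back as
    false positives.
    """
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(words & WATCHLIST)
```

Here `flag_message("you are a pig, watch yourself")` and `flag_message("thanks for walking the dog")` both match, even though only the first is abusive, which is exactly why cross-referencing pre-made lists produces so many false positives.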

Westpac has announced that it will no longer allow abusive messages in transaction descriptions. In February, the bank said it wanted to create a safer digital banking experience for customers and send a clear signal that abusive messaging in payment transactions would not be tolerated. Commonwealth Bank has followed suit, developing an AI machine-learning model to help it identify payments with abusive transaction descriptions.

If you or someone you love needs help, please call Lifeline Australia on 13 11 14, the National Sexual Assault and Domestic Violence Counseling Service on 1800 737 732, or MensLine Australia on 1300 789 978.

If life is in danger, call 000.
