The global data highlights a worrying trend: Australian organizations are falling into a "trust gap" between what customers expect them to do with data and privacy and what is actually happening.
New data shows that 90% of people want organizations to transform how they manage data and risk. With regulation lagging behind, Australian organizations that want to achieve this will have to move faster than the regulatory environment.
Australian businesses need to take data concerns seriously
According to a recent major global study from Cisco, more than 90% of people believe that generative AI requires new techniques to manage data and risk (Figure A). Meanwhile, 69% are concerned about the possibility of legal and intellectual property rights being compromised, and 68% are concerned about the risk of disclosure to the public or competitors.
Essentially, while customers appreciate the value that AI can bring them in terms of personalization and service levels, they are also uncomfortable with the implications for their privacy if their data is used as part of AI models.
PREMIUM: Australian organizations should consider an AI ethics policy.
About 8% of Cisco survey participants were from Australia, although the study does not break down these concerns by country.
Australians more likely to violate data security policies despite data privacy concerns
Other research shows that Australians are particularly sensitive to how organizations use their data. According to research from Quantum Market Research and Porter Novelli, 74% of Australians are concerned about cybercrime. Additionally, 45% are worried about their financial information being stolen and 28% are worried about their identity documents, such as passports and driver's licenses (Figure B).
However, Australians are also twice as likely as the global average to violate data security policies at work.
As Gartner vice president and analyst Nader Henein said, organizations should be deeply concerned about this breach of customer trust, because customers will be happy to take their wallets and walk away.
"The fact is that today's consumers are more than happy to cross the road to the competition and, in some cases, pay a premium for the same service, if that is where they believe their data and that of their family is best taken care of," Henein said.
Voluntary compliance by Australian organizations is not good for data privacy
Part of the problem is that, in Australia, doing the right thing on data privacy and artificial intelligence is largely voluntary.
SEE: What Australian IT leaders should focus on ahead of privacy reforms.
"From a regulatory perspective, most Australian companies are focused on breach disclosure and reporting, given all the high-profile incidents over the past two years. But when it comes to basic privacy issues, there are few requirements for businesses in Australia. The main pillars of privacy, such as transparency, consumer privacy rights and explicit consent, are simply missing," Henein said.
Only those Australian organizations that do business overseas and encounter external regulations have needed to lift their standards; Henein pointed to the GDPR and New Zealand's privacy laws as examples. Other organizations will need to make building trust with their customers an internal priority.
Build trust in the use of data
While data use in AI may be unregulated and voluntary in Australia, there are five things the IT team can (and should) champion across the organization:
- Transparency about data collection and use: Transparency around data collection can be achieved through clear and easy-to-understand privacy policies, consent forms, and opt-out options.
- Responsibility with data governance: Everyone in the organization should recognize the importance of data quality and integrity in data collection, processing, and analysis, and policies should be in place to reinforce behavior.
- High data quality and accuracy: Data collection and use must be accurate, as inaccurate data can make AI models untrustworthy, which in turn undermines confidence in how data is secured and managed.
- Proactive incident detection and response: Incidents must be detected and dealt with quickly, as an inadequate incident response plan compounds the damage to both the organization's reputation and its customers' data.
- Customer control over their own data: All services and features that involve data collection must allow the customer to access, manage and delete their data on their own terms and whenever they wish.
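As a concrete illustration of the transparency and customer-control pillars above, here is a minimal, hypothetical sketch in Python of a consent and data-rights store. The class and method names are illustrative only and are not drawn from the Cisco or Gartner research; a real implementation would sit behind authenticated APIs and durable storage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustomerRecord:
    customer_id: str
    email: str
    marketing_consent: bool = False
    consent_log: list = field(default_factory=list)  # audit trail for transparency

class PrivacyStore:
    """Illustrative in-memory store giving customers access, opt-out and deletion."""

    def __init__(self):
        self._records = {}

    def register(self, customer_id: str, email: str) -> None:
        self._records[customer_id] = CustomerRecord(customer_id, email)

    def set_consent(self, customer_id: str, granted: bool) -> None:
        # Every consent change is timestamped, supporting accountability.
        rec = self._records[customer_id]
        rec.marketing_consent = granted
        rec.consent_log.append((datetime.now(timezone.utc).isoformat(), granted))

    def export(self, customer_id: str) -> dict:
        """Right of access: return everything held about the customer."""
        rec = self._records[customer_id]
        return {
            "customer_id": rec.customer_id,
            "email": rec.email,
            "marketing_consent": rec.marketing_consent,
            "consent_log": list(rec.consent_log),
        }

    def delete(self, customer_id: str) -> None:
        """Right of erasure: remove the customer's data entirely."""
        self._records.pop(customer_id, None)

store = PrivacyStore()
store.register("c1", "customer@example.com")
store.set_consent("c1", True)
store.set_consent("c1", False)  # the customer opts out on their own terms
print(store.export("c1")["marketing_consent"])  # False
store.delete("c1")
```

The design choice worth noting is the consent log: recording every change, not just the current state, is what lets an organization demonstrate, rather than merely assert, that it honored a customer's wishes.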
Self-regulate now to prepare for the future
Currently, data privacy law, including for data collected and used in AI models, is governed by regulations written before AI models were in widespread use. The only additional rules Australian companies apply are therefore self-imposed.
However, as Gartner's Henein said, there is broad consensus about the right path forward for managing data when it is used in these new and transformative ways.
SEE: Australian organizations to focus on ethics of data collection and use in 2024.
"In February 2023, the Privacy Act Review Report was published with many good recommendations aimed at modernizing data protection in Australia," Henein said. "Seven months later, in September 2023, the Federal Government responded. Of the 116 proposals in the original report, the government responded favorably to 106."
For now, some executives and boards may resist the idea of self-imposed regulation, but an organization that can demonstrate it is taking these measures will gain a stronger reputation among customers and will be seen as taking their concerns around the use of data seriously.
Meanwhile, some within the organization might be concerned that self-regulation could impede innovation. As Henein said in response: "Would you have delayed the introduction of seat belts, crumple zones and airbags for fear that these features would slow down the development of the automotive industry?"
Now is the time for IT professionals to take charge and start closing that trust gap.