University researchers have slammed auditors for leaving banks, government agencies and businesses exposed to intruders through misconfigured and unpatched network routers and switches.
Charles Sturt University teamed with penetration testers and security researchers for a three-year study of 1323 routers and 452 switches in more than 400 organisations that handled credit cards and other sensitive information.
It found gross oversights in security controls such as routers with default passwords, misconfigured network services and poor or absent access controls.
“We could take over a router and then take over the network,” said report author Craig Wright.
Only 4 percent of routers and 1.2 percent of switches were patched and correctly configured, the researchers found. Switches took almost a year on average to be patched, and the devices were never tested by auditors, who also rarely examined the corresponding client software.
“Consequently, there is little incentive for the organisation under audit to maintain critical systems,” the authors wrote.
Organisations had regular audits from “respectable” security firms and some were deemed compliant with the Payment Card Industry Data Security Standard (PCI DSS) and ISO 27001, a security-management standard.
The industry's drive to the “lowest common denominator” meant organisations and auditors chose to overlook serious security flaws in the name of profits, Dr Wright said.
IT staff who had incentives tied to results would often “lie by omission” to pass the tests. And auditors would take their word rather than test and verify, which would treble the audit cost.
Auditors were “watchdogs and not bloodhounds”, the researchers wrote.
The risks from hacking “leave the auditor seeking the compliance tests that bring them the greatest returns with little risk of fallout when they fail”.
No auditor examined their subjects' network-equipment firmware during the study, and organisations were focused on getting the network auditor’s tick.
“It’s easier not to tackle the gaps and put a junior on the job,” Dr Wright said. “They know what needs to be done to pass the audit and that’s what they focus on.”
Patch policies were in place for servers and client operating systems, but it took up to three months to fix server holes and 50 days to patch operating systems.
The policies were rarely required for network devices.
Operating system patches for client systems and firewalls were applied and tested within two months.
He scoffed at popular “tick-box” auditing arrangements, in which networks were examined no more than once every few months, saying they were insufficient to keep organisations abreast of security vulnerabilities.
“Spending money to demonstrate compliance does not in itself provide security.”
The report blamed government and commercial groups such as the Payment Card Industry for inflating the importance of compliance schemes, company negligence rules and governance functions, with reports written to demonstrate compliance used in place of a “real effort to ensure that data protection occurs”.
The legal system’s focus on “conventional, fault-based tort principles” (litigation) meant a favourable compliance report could help absolve an organisation of liability.
Audits should be done weekly and drill into a single area of security each time, the researchers wrote. Dr Wright said auditors big and small, in Australia, Britain and the US, were guilty of the practice.
He said the hallmark of a good auditor was integrity; auditors should be chosen based on a trial assessment of an organisation’s network and instructed to test against information-security frameworks such as OWASP, which covers web-application security.
“The practice of implementing monitoring controls that do not report on breaches but which do satisfy the compliance needs of an organisation can cost far more in the long term,” researchers concluded.