Palo Alto Networks Shines Light on Application Services Security Challenge
An analysis published by Palo Alto Networks finds a typical large organization adds or updates over 300 services every month, with those new and updated services being responsible for approximately 32% of new high or critical cloud exposures.
Conducted by the Unit 42 research team at Palo Alto Networks, the analysis is based on several petabytes of data that Palo Alto Networks collected via Cortex Xpanse, its platform for discovering IT assets.
Greg Heon, senior director of product management at Palo Alto Networks, said the analysis suggests that many organizations are dynamically adding and updating services at rates much faster than cybersecurity teams can track. As a result, there is a high probability that vulnerabilities will be created as application development teams continuously add code to those services, he added.
In some sectors, such as telecommunications, insurance, and pharmaceutical and life sciences, more than 1,000 services are being updated or added every month, while financial services, healthcare and manufacturing organizations typically add more than 200 services monthly.
Overall, Unit 42 researchers discovered that 26% of the exposures found involved critical IT and security infrastructure.
Professional and legal services, high technology, manufacturing, healthcare, finance, and wholesale and retail together accounted for 63% of the vulnerabilities discovered.
Just three categories of exposures (IT and networking infrastructure, business operations applications and remote access services) account for 73% of high-risk exposures.
The rate at which services are being added and updated is not expected to slow, so there is a need to automate processes in ways that result in fewer vulnerabilities being created. Otherwise, cybersecurity teams will only continue to be overwhelmed by vulnerabilities, especially as developers rely more on artificial intelligence (AI) to write code of varying levels of security quality faster than ever.
Making matters more challenging, many of the services being updated reside in cloud computing environments that application developers configure using code. However, the developers who write that code often have limited cybersecurity expertise, so the likelihood that a cloud service will be misconfigured is significant.
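To illustrate the kind of misconfiguration at issue, consider a minimal sketch of an automated policy check run against a service configuration expressed as plain data. This is a hypothetical example, not drawn from the Unit 42 analysis or any particular cloud provider; the field names and rules are invented for illustration.

```python
# Hypothetical sketch: scan a cloud service configuration (expressed as
# plain data, as infrastructure-as-code tools ultimately produce) for
# common risky settings. Field names and thresholds are invented.

def find_misconfigurations(service: dict) -> list:
    """Return human-readable findings for common risky settings."""
    findings = []
    # Flag ingress rules open to the entire internet on sensitive ports
    # (SSH, RDP, MySQL in this toy example).
    for rule in service.get("ingress_rules", []):
        if rule.get("cidr") == "0.0.0.0/0" and rule.get("port") in (22, 3389, 3306):
            findings.append("port %d open to the internet" % rule["port"])
    # Flag storage buckets exposed for public read access.
    for bucket in service.get("buckets", []):
        if bucket.get("public_read"):
            findings.append("bucket '%s' allows public read" % bucket["name"])
    return findings

service = {
    "ingress_rules": [{"cidr": "0.0.0.0/0", "port": 22}],
    "buckets": [{"name": "backups", "public_read": True}],
}

for finding in find_misconfigurations(service):
    print("HIGH:", finding)
```

Checks of this sort are what attack surface management platforms automate at scale: each time a service is added or updated, its configuration is re-evaluated so an exposure is flagged before attackers find it.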
In general, the attack surface that organizations are trying to defend is only going to continue to expand, especially as most modern platforms become configurable using code. Cybersecurity teams require persistent visibility into those platforms to ensure that each time they are updated, a new potential exploit isn't created, said Heon.
The issue is most of those cybersecurity teams have limited resources so ensuring the security of all the attack surfaces that need to be defended represents a major challenge, he added.
Hopefully, the rise of AI technologies will result in both fewer mistakes being made by developers and faster discovery and resolution of issues. Palo Alto Networks, for example, is adding AI copilots across its entire portfolio, noted Heon.
Nevertheless, no matter how much automation is applied to security, there is no substitute for vigilance. After all, cybercriminals are now taking advantage of the same advances to discover and attack services each time they are added or updated.