Security applications are subject to the age-old computing axiom of “garbage in, garbage out.” To work effectively, they need the right data. Too much irrelevant data can overwhelm a solution’s processing and analytics and degrade the results it delivers. Too little, and it may miss something crucial. It’s mainly a question of relevance, volume and velocity.
How much data?
A central question, then, is how much data is enough? What is the correct balance? The answer should not be a compromise but the optimal level for the task at hand. That level depends not merely on the amount of data but on how the data will be used, its granularity, the processing the solution performs and the quality of its algorithms or machine learning capabilities, if employed.
At a minimum, there needs to be full coverage or representation for the particular security task. For instance, if the object is to find an attacker who may have gained access to a network and is quietly conducting reconnaissance and lateral movements to gain access to assets, one may be primarily concerned with data center traffic and activity within the corporate network.
Inbound or outbound internet traffic may be less of a concern, unless one is specifically focused on command-and-control or exfiltration activity, which tends to reveal an attack late rather than early.
If one is more focused on access control and remote users, access to VPN traffic is essential to the security solution. Other solutions may be more focused on traffic from the internet to detect a malware attack. The object is to look in the right place and ensure that there is no blind spot limiting a solution’s ability to make the right assessments or produce the proper actions. If a security solution cannot see, or lacks access to, relevant areas of network traffic, it will not be able to properly serve its function.
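The coverage question can be made concrete with a simple check of whether an observed address falls inside the zones a solution is tasked with watching. The zone names and address ranges below are hypothetical placeholders; a real deployment would derive them from its own address plan.

```python
import ipaddress

# Hypothetical monitored zones; an actual deployment would use its own
# address plan (data center, corporate LAN, VPN pool, etc.).
ZONES = {
    "datacenter": ipaddress.ip_network("10.10.0.0/16"),
    "corporate": ipaddress.ip_network("10.20.0.0/16"),
    "vpn_pool": ipaddress.ip_network("172.16.0.0/20"),
}

def classify(ip: str) -> str:
    """Return the zone an address belongs to, or 'internet' if it falls
    outside all monitored ranges: a quick way to spot traffic a given
    solution is (or is not) supposed to see."""
    addr = ipaddress.ip_address(ip)
    for zone, net in ZONES.items():
        if addr in net:
            return zone
    return "internet"

print(classify("10.10.4.7"))   # → datacenter
print(classify("8.8.8.8"))     # → internet
```

A solution focused on lateral movement would want the "datacenter" and "corporate" zones fully covered; one focused on remote access would need "vpn_pool" in view.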
The next consideration is what data is actually needed. Many solutions need only packet header information, while others need to fully inspect each packet. Obviously, working with metadata is far less burdensome than deep packet inspection. Metadata, literally “data about data,” may provide flow behavior information or telemetry at exactly the level a solution expects. IPFIX or NetFlow offers a wealth of critical details and may be sufficient for the needs of a security solution. Full packet ingestion should be performed only when necessary, when header or flow details will not tell the entire story.
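The difference between header-level metadata and full packet inspection can be sketched in a few lines. The function below extracts the classic 5-tuple (addresses, ports, protocol) from a raw IPv4/TCP packet without ever touching the payload; this is a simplified illustration, not any particular product’s implementation.

```python
import socket
import struct

def five_tuple(packet: bytes):
    """Extract flow metadata (the 5-tuple) from a raw IPv4/TCP packet,
    reading only header bytes and leaving the payload uninspected."""
    # IPv4 header: version/IHL share byte 0, protocol is byte 9,
    # source and destination addresses occupy bytes 12-19.
    ihl = (packet[0] & 0x0F) * 4          # IP header length in bytes
    proto = packet[9]
    src_ip = socket.inet_ntoa(packet[12:16])
    dst_ip = socket.inet_ntoa(packet[16:20])
    # TCP/UDP ports are the first two 16-bit fields after the IP header.
    src_port, dst_port = struct.unpack("!HH", packet[ihl:ihl + 4])
    return (src_ip, src_port, dst_ip, dst_port, proto)

# A minimal hand-built IPv4/TCP packet for demonstration.
ip_header = struct.pack(
    "!BBHHHBBH4s4s",
    0x45, 0, 40, 0, 0, 64, 6, 0,          # version/IHL ... TTL, proto=6 (TCP)
    socket.inet_aton("10.0.0.1"),
    socket.inet_aton("192.168.1.5"),
)
tcp_header = struct.pack("!HH", 44123, 443) + b"\x00" * 16
print(five_tuple(ip_header + tcp_header))
# → ('10.0.0.1', 44123, '192.168.1.5', 443, 6)
```

Deep packet inspection, by contrast, would parse everything past the transport header, which is why it carries so much more processing cost per packet.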
Network packet brokers have provided security solutions with network traffic from the right locations and the right level of detail for some time now. Some of these have been able to host security solutions in virtual environments or simply pass traffic to standalone appliances. Often the solutions have firm limits on the applications they support, typically from an approved list of technology partners.
Requiring certified or approved vendors may help ensure compatibility or even some degree of optimization, but it also limits flexibility and openness. In most cases compatibility is not a real obstacle, and more vendors should favor greater openness.
Ideally, the network packet broker would support any or all security solutions, provide traffic from the relevant portions of the network and perform all of the processing necessary for the solution to be able to do its particular job. Security groups would benefit from being able to easily deploy the latest technologies and more appropriate solutions for their needs. In addition, unburdening each security solution from utility processing, such as TLS decryption, metadata extraction, IPFIX generation and other chores can help ensure not only that solutions get the right traffic in the right way but also that they are not slowed down with performance-draining tasks.
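One of the chores mentioned above, flow-record generation, can be sketched as a toy flow cache. This is a hedged illustration of the NetFlow/IPFIX-style aggregation a packet broker might offload, not a real exporter: actual implementations also track timestamps, TCP flags and flow expiry, and encode records as IPFIX messages.

```python
from collections import defaultdict

class FlowCache:
    """Toy sketch of NetFlow/IPFIX-style record generation, the kind of
    utility processing a packet broker could take off a security
    solution's hands. Keys are illustrative 5-tuples."""

    def __init__(self):
        self._flows = defaultdict(lambda: {"packets": 0, "bytes": 0})

    def observe(self, flow_key, length):
        """Account one packet of the given length to its flow."""
        rec = self._flows[flow_key]
        rec["packets"] += 1
        rec["bytes"] += length

    def export(self):
        # A real exporter would encode these as IPFIX messages and flush
        # them on timeout; here we simply return the aggregated records.
        return dict(self._flows)

cache = FlowCache()
flow = ("10.0.0.1", 44123, "192.168.1.5", 443, 6)
for size in (60, 1500, 1500):
    cache.observe(flow, size)
print(cache.export()[flow])  # → {'packets': 3, 'bytes': 3060}
```

A downstream security solution consuming these records gets behavioral telemetry without paying the per-packet processing cost itself.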
This kind of cooperation between a network packet broker and a security solution can help each with the network traffic they need to operate effectively without producing a burden on each solution. Each solution then needs to work efficiently and effectively on the traffic using algorithms that are actually tuned to the data. It’s one thing to have algorithms that logically meet the requirements of a security task, but it is another to hone those algorithms to more effectively fit the data. These streamlined algorithms make best use of data to achieve better fidelity and accuracy.
Security solutions need to keep evolving, but the way they are deployed also needs to evolve. Fixing these two issues will advance an organization’s ability to meet ever-increasing challenges.
Cooperation is required
Organizations have been plagued by security solutions that either completely miss the types of threats they are intended to find or overwhelm security teams with so many alerts that finding a real indication of trouble is like finding a needle in a haystack. Data breaches continue to be a persistent threat. Corporate intellectual property and trade secrets are stolen at rates alarming enough to measurably dent national GDP. Ransomware is still ramping up. Can network security achieve a new level of effectiveness?
The deficiency is not entirely the fault of the security solutions themselves. Network infrastructure should share some of the responsibility. Oftentimes, missed threats or a flood of alerts consisting chiefly of false positives come down to a garbage-in, garbage-out data issue: not having the right traffic, being overloaded with irrelevant traffic, drowning in too much detail or not having enough.
Sometimes traffic-processing utility functions bog down a security solution’s performance and force compromises in how much data can be ingested. Such conditions can and should be remedied by new generations of network packet brokers that better serve security solutions. The two need to work more cooperatively—and effectively.
With more cooperation and partnership between security solutions and network infrastructure, network security can advance and better meet the growing challenges and needs of organizations. Perhaps the garbage in, garbage out issues can finally be put to rest and discarded in the trash.