Software makers invariably leave security vulnerabilities in their products. They take pains to remove bugs, and most test their software thoroughly to reveal shortcomings, yet they rarely catch every potential problem even when they apply abundant time, skill and money. Those who claim software can be thoroughly debugged and vulnerability-free reveal their inexperience, and perhaps a questionable agenda.

A new software suite does not operate in isolation. It relies on collaboration with other utility software and firmware for task scheduling, interrupt handling, resource allocation, and protocols that pass control to code in another location. New software also relies on utility software to retrieve data from, and deposit data into, storage and attached devices. When a manufacturer changes utility software, it must take great care to avoid conflicts with other software. To avoid those clashes, changes are usually released as formal versions, with the new utility software aligned to the other software that depends on it.
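The versioning discipline described above can be sketched as a simple compatibility check. The rule shown here (same major version, equal-or-newer minor version) is one common semantic-versioning convention, not a universal standard, and the version strings are illustrative:

```python
# Sketch of a semantic-version compatibility check, assuming the common
# convention that software built against utility version X.Y works with
# any X.Y' where Y' >= Y, but not across a major-version change.
def parse_version(version: str) -> tuple:
    """Split a 'major.minor.patch' string into integer parts."""
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

def is_compatible(required: str, installed: str) -> bool:
    """Return True if the installed utility satisfies the requirement."""
    req_major, req_minor, _ = parse_version(required)
    inst_major, inst_minor, _ = parse_version(installed)
    return inst_major == req_major and inst_minor >= req_minor

print(is_compatible("2.3.0", "2.5.1"))  # same major, newer minor -> True
print(is_compatible("2.3.0", "3.0.0"))  # major version changed -> False
```

Formal version numbering like this lets dependent software declare exactly which utility releases it was aligned with.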

For a brand-new piece of software, consumers become the second phase of testing, either as acknowledged “beta” users or as early “bleeding edge” adopters. Those early commercial experiences provide software authors with an avalanche of actual malfunctions and awkward behaviors that will be addressed through patches, updates and new versions. Some of those early changes will address security issues, but rarely will they address all the security weaknesses.

Security weaknesses are whittled away through software “patches” and successive software versions. Users are expected to keep up to date (i.e., install the updates and patches) in order to get the benefits of both superior functionality and reduced security exposure. Keeping up to date with software packages can be a noticeable skilled labor cost. On the other hand, skipping or delaying updates can expose the enterprise to hacker-induced mayhem and punishment through lawsuits and reputation damage.
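Part of that skilled labor is simply knowing which installed components lag the vendor's latest release. A minimal inventory check might look like the following sketch; the package names and version numbers are made up for illustration:

```python
# Hypothetical sketch: flag installed packages that lag the vendor's
# latest published version. All names and versions here are invented.
installed = {"webserver": "2.4.1", "database": "9.6.0", "crypto-lib": "1.0.2"}
latest = {"webserver": "2.4.7", "database": "9.6.0", "crypto-lib": "1.1.0"}

def version_key(version: str) -> tuple:
    """Turn '2.4.1' into (2, 4, 1) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

# Collect every package whose installed version is older than the latest.
outdated = {name: (have, latest[name])
            for name, have in installed.items()
            if version_key(have) < version_key(latest[name])}

for name, (have, want) in sorted(outdated.items()):
    print(f"{name}: installed {have}, latest {want} -- update recommended")
```

Even a crude report like this makes the update backlog, and therefore the security exposure, visible to whoever must weigh the labor cost of patching.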

The Windows XP operating system was introduced in 2001 and became extremely popular on home and commercial PCs. As it aged, successors followed: Windows Vista in 2007, then Windows 7 in 2009. Microsoft gave XP users years of notice that it would cease supporting the system. Nevertheless, Windows XP is still used on roughly 7% of PCs despite Microsoft formally halting support in April 2014. Windows 7 was succeeded by Windows 8, then by Windows 10, the current operating system with full support.

Users of Windows XP should know that they have been relying on an operating system that no longer receives the updates needed to thwart security attacks. When software users ignore the heightened security risk of running unsupported software, they are signing up for the inevitable problems. Microsoft should not be held responsible for a vulnerability in a 16-year-old product that has been unsupported for three years. Those who refuse to run up-to-date software carry the most responsibility.
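Whether a system is past its vendor's end-of-support date is a mechanical check. The sketch below uses Microsoft's announced Windows XP cutoff of April 8, 2014; any other entries an organization adds to such a table would need to come from the vendor's own lifecycle documentation:

```python
# Sketch: warn when an operating system is past its end-of-support date.
# The Windows XP date is Microsoft's announced cutoff (April 8, 2014).
from datetime import date

END_OF_SUPPORT = {
    "Windows XP": date(2014, 4, 8),
}

def is_unsupported(os_name: str, today: date) -> bool:
    """True if the OS has a known end-of-support date that has passed."""
    cutoff = END_OF_SUPPORT.get(os_name)
    return cutoff is not None and today > cutoff

print(is_unsupported("Windows XP", date(2017, 5, 15)))  # -> True
```

An enterprise running this check against its machine inventory would have had three years of warnings before WannaCry arrived.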

When they smell a payday, malevolent hackers with resources and time on their hands will stress and probe commercial and government software and networks. They will devise an exploit that pushes the software beyond its intended use and behavior. If their exploit steals money directly, so much the better. If the cyber crook must convert the exploit’s theft of private information into money, then so be it. If the exploit is suited for extortion and ransom, then it will be used that way. Malevolent hackers will do what it takes to seize the financial, political or intellectual payoff. Many of these criminal hackers successfully hide behind weak or US-hostile regimes.

WannaCry is the latest large-scale cyber ransom campaign to pester government and commerce. The National Security Agency (NSA) routinely stockpiles knowledge of some vulnerabilities that it might use for its legitimate intelligence gathering. Part of that NSA stockpile was inadequately protected and was launched into the wild in April 2017 by hackers calling themselves the “Shadow Brokers.” One element in the stockpile, called EternalBlue, became the foundation for WannaCry, a ransom tool that took advantage of a vulnerability present in the moribund Windows XP.

Some pundits would prefer that the NSA had alerted Microsoft to the Windows XP vulnerability so that Microsoft could issue a patch, foreclosing mischief like WannaCry. While Windows XP is an old system, it is probably used by some of NSA’s intelligence targets, so the vulnerability may well be useful for intelligence gathering. NSA probably has the more informed insight on the best path for handling each vulnerability: gather intelligence, or protect users of stale software. Nevertheless, NSA should have better protected the cluster of cyber-tools that the Shadow Brokers made available to hackers.

From a consumer’s perspective, the larger issues are not about WannaCry. The days of the Wild West Internet are over, and ordinary people are being harmed as collateral damage in cybercrimes. One unresolved big issue is adopting a case-by-case mechanism for choosing between privacy and security. That role generally falls to the Foreign Intelligence Surveillance Court, but it is too cloak-and-dagger and lacks the public’s confidence. Another issue is the degree of liability applicable to those who design or run insecure computer systems and networks.

Our glacial, overly expensive court system does not deliver meaningful “justice” to consumers. Law enforcement rarely apprehends and convicts cybercriminals. Cybercrime leaves plenty of blame to share. What we need most is a competent political leader.