Modern Science in Cyber Crimes

Despite all the attention cyber security has received, it has been a difficult journey so far. Total spending on IT security was anticipated to exceed $120 billion in 2017, and it is one of the few areas where most companies' IT budgets stayed flat or even grew during the recent financial crisis. Yet this has not significantly reduced the number of vulnerabilities in software or the attacks by criminal organisations.

The US Government has been bracing for a "Cyber Pearl Harbor": an all-out attack that could disrupt essential services or even cause the destruction of property and lives. Such an attack is expected to be orchestrated by the criminal underbelly of nations such as China, Russia or North Korea.

It is time to completely rethink how we safeguard the IT infrastructure we use. Our approach to security is fragmented, built on point solutions for specific threats: anti-virus, spam filters, intrusion detection and firewalls. But we have now reached a point where cyber systems are much more than just tin and wire and software.

They have systemic problems with an economic, social and political dimension. The interconnectedness of systems, intertwined with a human element, makes IT systems inseparable from the human factor. Complex cyber systems today almost have a life of their own; they are complex adaptive systems that we have tried to understand and tackle using traditional theories.

Before getting into the reasons for treating a cyber system as a complex system, here is a brief outline of what a complex system is. Note that "system" can refer to any combination of people, processes or technology that fulfils a certain purpose: the wristwatch you are wearing, sub-oceanic coral reefs, or the economy of a country are all examples of a "system".

In simple terms, a complex system is any system in which the parts of the system and their interactions together produce a specific behaviour, such that an analysis of its constituent parts alone cannot explain the behaviour of the whole. In such systems cause and effect cannot necessarily be related, and the relationships are non-linear: a small change can have a disproportionate impact. In other words, as Aristotle said, "the whole is more than the sum of its parts". One of the most popular examples in this context is an urban traffic system and the emergence of traffic jams: analysing individual cars and drivers cannot explain the patterns and emergence of traffic jams.
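Emergence of this kind is easy to demonstrate in code. Below is a minimal sketch of the classic Nagel-Schreckenberg traffic model (the parameter values and the function name `nasch_step` are my own illustrative choices, not from the text): each driver follows three trivial rules, yet stop-and-go jams appear on the road with no car individually "causing" them.

```python
import random

def nasch_step(road, length, v_max=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg model on a circular road.
    `road` maps cell index -> speed of the car occupying that cell."""
    cells = sorted(road)
    new_road = {}
    for i, pos in enumerate(cells):
        v = min(road[pos] + 1, v_max)                         # rule 1: accelerate
        gap = (cells[(i + 1) % len(cells)] - pos - 1) % length
        v = min(v, gap)                                       # rule 2: brake, never hit the car ahead
        if v > 0 and rng.random() < p_slow:
            v -= 1                                            # rule 3: random slowdown (driver whim)
        new_road[(pos + v) % length] = v
    return new_road

# 30 cars on a 100-cell ring road; clusters of stopped cars (jams) emerge
rng = random.Random(1)
road = {pos: 0 for pos in rng.sample(range(100), 30)}
for _ in range(200):
    road = nasch_step(road, 100, rng=rng)
stopped = sum(1 for v in road.values() if v == 0)
```

No rule mentions a "jam", yet the interaction of braking and random slowdowns produces them: exactly the gap between component-level rules and system-level behaviour that reductionist analysis misses.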

A complex adaptive system (CAS) additionally has the characteristics of self-learning, emergence and evolution among its participants. The participants, or agents, in a CAS show heterogeneous behaviour, and their behaviour and interactions with other agents are continuously evolving.

A complex system should not be confused with a "complicated" process. A complex process is one that produces unpredictable results, however simple the steps might seem; a complicated process is one that involves lots of intricate steps and difficult preconditions, but with a predictable final outcome. An often-used example: making tea is complex (at least for me... I have never been able to make a cup that tastes the same as the previous one), while building a car is complicated. David Snowden's Cynefin framework gives a more formal description of these concepts.

This area of study is not brand new; its roots can be traced back to the work on Metaphysics by Aristotle [8]. Complexity theory is largely inspired by biological systems and has been employed in epidemiology, social science and natural science research for quite a while now. It has been used to study economic systems and free markets alike, and is gaining acceptance in financial risk analysis as well.

Today's IT systems are designed and built by us (the human community of IT professionals in an organisation, together with its suppliers), and collectively we possess all the knowledge there is to have about these systems. So why do we see fresh attacks on IT systems every day that we never anticipated, exploiting vulnerabilities we never knew existed? One reason is that any IT system is designed by thousands of individuals across the whole technology stack, from the business application down to the hardware and network components it runs on. That introduces a strong human element into the design of cyber systems, and opportunities become ubiquitous for the introduction of flaws that can become vulnerabilities.

Most companies have multiple layers of defence for their critical systems (layers of firewalls, IDS, hardened OS, strong authentication, etc.), but attacks still happen. More often than not, computer break-ins are a collision of circumstances rather than a single standalone vulnerability being exploited for a cyber-attack to succeed. In other words, it is the "whole" of the circumstances and the actions of the attackers that cause the damage.

Reductionism contends that any system or machine, however complicated, can be understood by studying its constituent parts. Most of modern science and analysis is founded on the reductionist approach and, to be fair, it has served us well so far. By understanding what each part does, you really can predict how a wristwatch will behave; by designing each piece separately, you really can make a car behave the way you want; and by analysing the positions of celestial objects, we can forecast the next solar eclipse. Reductionism has a firm focus on causality: there is a cause for every effect.

But that is the extent to which the reductionist viewpoint can explain the behaviour of a system. When it comes to emergent systems such as human behaviour, socio-economic systems, biological systems or socio-cyber systems, the reductionist approach has its limitations. Simple examples such as the human body, a crowd's reaction to a political stimulus, the response of financial markets to the news of a merger, or a traffic jam cannot be predicted even by studying in detail the behaviour of the individual members that make up each of these "systems".

We have traditionally examined cyber security through a reductionist lens, with specific point solutions for individual problems, attempting to anticipate the attacks a cybercriminal might launch against known vulnerabilities. It is high time we looked at cyber security through an alternative, holistic approach as well.

Computer break-ins are more akin to viral or bacterial infections than a home or car break-in. A burglar who breaks into a house cannot use it as a launch pad to break into the neighbours' houses, nor can the weakness in one car's lock system be exploited against millions of other locks across the globe at once.

They are more similar to microbial infections of the human body: they can propagate the infection just as humans do; they can affect large portions of a population, as long as the members are "connected" to one another; and in the case of severe infections, the affected systems are generally isolated, just as people are put in quarantine to stop the spread.

Even the lexicon of cyber systems uses biological metaphors: viruses, worms, and so on. There are many parallels with epidemiology, but the design principles used in cyber systems are not aligned with natural selection. Cyber systems rely heavily on uniformity of technology and processes, in contrast to the diversity of genes within a species, which makes that species more resilient to epidemic attacks.

The flu pandemic of 1918 claimed an estimated 50 million lives, far more than the Great War itself. Nearly everyone was infected, but why did it hit 20-40 year olds harder than others? Perhaps a difference in body structure, causing a different reaction to the attack?

Complexity theory has gained great currency and proved extremely useful in epidemiology, helping us understand the patterns of spread of infections and strategies for controlling them. Researchers are now turning to applying the learnings from the natural sciences to cyber systems.
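As an illustration of the kind of model epidemiology contributes, here is a minimal sketch of the classic SIR (Susceptible-Infected-Recovered) compartment model in discrete daily steps. The rates `beta` and `gamma` and the function name `sir_step` are illustrative assumptions, not values from the text; the same structure is what researchers adapt to model malware spreading across connected hosts.

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One day of a discrete SIR epidemic model.
    beta: transmission rate per contact; gamma: recovery rate (illustrative)."""
    new_infections = beta * s * i   # proportional to susceptible-infected mixing
    recoveries = gamma * i
    return s - new_infections, i + new_infections - recoveries, r + recoveries

# Normalised population, 1% initially infected
s, i, r = 0.99, 0.01, 0.0
peak = i
for _ in range(200):
    s, i, r = sir_step(s, i, r)
    peak = max(peak, i)
```

Because `beta * s > gamma` at the outset, the infection grows, peaks once enough of the population has recovered, and then dies out: the epidemic curve emerges from two simple rates, with no component-level description of any individual victim.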

There have traditionally been two different and complementary approaches to mitigating security threats to cyber systems, both of which are in use in today's real-world systems.

The first approach relies on the testing team of an IT system to find any faults in the system that could expose a vulnerability exploitable by attackers. This could be functional testing to validate that the system gives the correct answer as expected, penetration testing to validate its resilience to specific attacks, or availability and resilience testing. The scope of this testing is generally the system itself, not the defences deployed around it.

This is a useful approach for reasonably simple, self-contained systems where the possible user journeys are fairly straightforward. For most interconnected systems, however, formal validation alone is not sufficient, as it is never possible to test everything.

Test automation is a popular way to reduce the dependency on humans in validation processes, but as Turing's Halting Problem of undecidability demonstrates, it is impossible to build a machine that tests another one in all cases. Testing provides only anecdotal evidence that the system works in the scenarios it has been tested for; automation simply helps gather that evidence faster.
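Turing's argument can be sketched in a few lines. Assume, for contradiction, a hypothetical oracle `halts(program, arg)` that always correctly predicts termination (the function names here are illustrative, not a real library); then a program can be built that defeats its own prediction:

```python
def halts(program, arg):
    """Hypothetical oracle: True iff program(arg) eventually halts.
    Turing's diagonal argument shows no such total function can exist,
    so this sketch leaves it unimplemented."""
    raise NotImplementedError("assumed to exist for the sake of contradiction")

def paradox(program):
    # Do the opposite of whatever the oracle predicts about self-application.
    if halts(program, program):
        while True:       # predicted to halt -> loop forever
            pass
    return "halted"       # predicted to loop -> halt immediately
```

Consider `paradox(paradox)`: if the oracle says it halts, it loops; if the oracle says it loops, it halts. Either way the oracle is wrong, so no universal automated verifier of program behaviour can exist.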

Where a system cannot be completely validated through formal testing, we deploy additional layers of defence in the form of firewalls or network segregation, or encapsulate it in virtual machines with a limited view of the rest of the network, and so on. Other popular defence mechanisms include intrusion prevention systems, anti-virus, and the like.

This strategy is widely used by most organisations as a defence against unknown attacks, since it is virtually impossible to formally ensure that a piece of software is free of security flaws and will remain so.

Resilience to disruption is a naturally occurring, essential characteristic of biological systems. Imagine an entire species of organisms sharing the identical genetic structure, body configuration, and similar immune systems and antibodies: the outbreak of a viral disease would wipe out the whole community. That does not happen, because we are all built differently and each individual has a different level of resistance to illness.

Similarly, certain mission-critical cyber systems, especially in the aerospace and medical industries, implement "diversity implementations" of the same functionality, with a centralised "voting" function deciding the response to the requester if the results from the diverse implementations do not match.

Differences in memory-stack layout, for example, could alter the variants' responses to a buffer-overflow attack, alerting the central "voting" system that something is wrong somewhere. If the input data and the function of the implementations are the same, any deviation in the implementations' responses is a sign of a potential attack.
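The voting function described above can be sketched very simply. This is a minimal illustration (the `vote` function and its error-handling policy are my own assumptions, not a specific product's API): diverse variants answer the same request, the majority answer wins, and dissenters are flagged for investigation.

```python
from collections import Counter

def vote(responses):
    """Majority vote over responses from diverse implementations of the same
    function. A dissenting minority is flagged as a possible attack; no
    majority at all means the variants cannot be trusted."""
    tally = Counter(responses)
    answer, votes = tally.most_common(1)[0]
    if votes <= len(responses) // 2:
        raise RuntimeError("no majority: possible compromise across variants")
    suspects = [idx for idx, r in enumerate(responses) if r != answer]
    return answer, suspects   # suspects: indices of disagreeing variants

# Two variants agree; the third (perhaps hit by a buffer overflow) deviates
answer, suspects = vote(["OK", "OK", "OVERFLOW"])
```

The same idea generalises: the more diverse the implementations, the less likely a single exploit perturbs a majority of them identically.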

Multi-variant Execution Environments (MVEEs) are being developed, in which applications with slight differences in implementation execute in lockstep and their responses to a request are monitored [12]. These have proved extremely effective in detecting intrusions that attempt to modify the behaviour of the code, and even in identifying existing flaws where the variants respond differently to a request.

In a similar vein, employing the concept of N-version programming, an N-version antivirus was developed at the University of Michigan that used diverse implementations to scan new files for the corresponding virus signatures. The result was a more resilient antivirus system, less vulnerable to attacks on itself and 35% better at detection across the estate.

Agent-Based Modelling (ABM) is a simulation technique used to analyse and understand the behaviour of complex systems, in particular complex adaptive systems. The individuals or groups interacting with each other within the complex system are represented as artificial agents that act according to a predefined set of rules. The agents can evolve their behaviour and adapt to circumstances. Contrary to deductive reasoning, which has most commonly been used to explain the behaviour of social and economic systems, simulation does not try to generalise the system or the agents' behaviour.

ABMs have been very popular for studying things like crowd behaviour during fire evacuations and the spread of epidemics, for explaining market behaviour and, more recently, for risk analysis in financial markets. It is a bottom-up modelling technique, in which the behaviour of each agent is programmed separately and can differ from that of all other agents. The self-learning and evolutionary behaviour of agents can be achieved with various methods, genetic algorithms being one of them.
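To make the bottom-up nature of ABM concrete, here is a minimal sketch of malware spreading across a network of host agents. Everything here is an illustrative assumption (the `Host` class, the contact-network construction, the susceptibility range): each host gets its own susceptibility, standing in for the diversity of implementation discussed earlier, and the estate-level outcome of an outbreak emerges from the individual agents' rules.

```python
import random

class Host:
    """An agent: a networked machine with its own susceptibility to infection."""
    def __init__(self, susceptibility):
        self.susceptibility = susceptibility   # diversity across the estate
        self.infected = False

def simulate(n=200, k=4, steps=50, seed=0):
    """Run an outbreak on a random contact network of n hosts, each talking
    to k peers, and return how many hosts end up infected."""
    rng = random.Random(seed)
    hosts = [Host(rng.uniform(0.0, 0.4)) for _ in range(n)]
    links = {h: rng.sample(hosts, k) for h in hosts}   # random contact network
    hosts[0].infected = True                           # patient zero
    for _ in range(steps):
        for h in [x for x in hosts if x.infected]:
            for peer in links[h]:
                if not peer.infected and rng.random() < peer.susceptibility:
                    peer.infected = True
    return sum(h.infected for h in hosts)

infected = simulate()
```

Because each agent is programmed separately, what-if analysis is straightforward: rerun the simulation with a narrower or wider spread of susceptibilities, a different network topology, or adaptive agents, and observe how the emergent outbreak size changes.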

Cyber systems are interconnections of software modules, logical circuitry, microchips, the Internet and a number of people (system users or end users). These interactions and actors can be implemented in a simulation model for what-if analysis, predicting the impact of changing parameters and of the interactions between the actors in the model. Simulation has been used to analyse application performance based on application characteristics and user behaviour for a long time, and some of the leading capacity and performance management tools employ the technique. Similar techniques can be applied to analyse the response of cyber systems to threats, to design fault-tolerant architectures, and to analyse the extent of emergent resilience due to diversity of implementation.

Complexity in cyber systems, in particular the use of Agent-Based Modelling to assess the emergent behaviour of systems, is a relatively new field of study with very little research done so far. There is still some way to go before Agent-Based Modelling becomes an enterprise-level proposition for companies. However, given the emphasis on cyber security and the inadequacies of our current posture, complexity science is certainly an area to which practitioners and academics are paying increasing attention.
