I've got a short definition of hacking that I like. It's independent of technology and doesn't carry the nasty implications of many of the mental models people have. It's not as succinct and elegant as the definition Richard Stallman gives, where hacking is "playful cleverness," but I think it works much better as a functional definition.
Hacking is a method of circumventing an accepted expectation or system to achieve a goal faster, more efficiently or more effectively than would otherwise result.

Here a system can be described as a set of processes, so a computer system is a set of computer processes. Software code defines these processes and the computer executes them, with or without human interaction. Computer hacking, then, is merely circumventing these codified processes. The process owner must protect against circumventions that lead to undesirable outcomes; failure to do so can be called a security flaw. A person who circumvents processes can be called a hacker. They didn't create the flaw, they only found and used it.
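To make the computer-hacking case concrete, here's a small, made-up sketch in Python (my own illustration, not part of the definition itself). The accepted expectation is "one coupon per order"; the codified process only checks that the coupon code is valid, so resubmitting the coupon circumvents the expectation.

```python
# Hypothetical example: a codified process whose owner's expectation
# ("one coupon per order") is never actually enforced in the code.

def apply_coupon(total: float, coupon: str) -> float:
    """Apply a 10% discount for a valid coupon code."""
    if coupon == "SAVE10":  # only validity is checked, not prior use
        return round(total * 0.90, 2)
    return total

price = 100.00
price = apply_coupon(price, "SAVE10")  # expected path: 90.00
# The circumvention: resubmit the same coupon. The process owner never
# protected against this outcome - that gap is the security flaw. The
# person who reuses the coupon didn't create the flaw, only found it.
price = apply_coupon(price, "SAVE10")  # 81.00
print(price)  # 81.0
```

The flaw isn't in any single line; it's in the gap between the accepted expectation and the process the code actually encodes.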
This definition also works for other types of hackers. There are popular communities and labels: travel hacker, life hacker, social hacker, growth hacker, and so on. I've even seen articles about garden hacking. Can you find an activity or thing that returns no results when you Google it with "hacker" appended?
Laws are another example of accepted expectations and processes. Hackers - and I include social, political and business hackers like entrepreneurs - see the loopholes in laws, whether those loopholes were created intentionally or not. That makes them potentially threatening to the governments that wrote those laws, but it doesn't have to. A government that seeks to codify accepted expectations and processes should seek out feedback from hackers if it wants to create better laws.
Most of these flaws are not discovered before laws formally codify them. As in computer hacking, some people seek to discover and use these flaws for their own benefit, while others seek to discover and publish them so that they can be fixed. A government's response to that publication says a lot about its willingness to make laws with minimal flaws.
Some see a heavy-handed response to quash public knowledge of flaws in policy and legal code as an indication that the flaws were created intentionally. I am not one of those people. I think it better resembles the reaction software makers have when researchers point out flaws in their code: they go through Katie Moussouris' five stages of vulnerability response grief. (It's a good 7-minute watch.)
I would say these political, economic and other systems must be tested for flaws so the flaws can be addressed. Think of this as hardening politics, the economy and the social order. If we don't help harden these systems, we are helping those who seek to gain personal advantage from their flaws. We can use existing structures and frameworks for security testing as blueprints until better ones are available. Or maybe better ones already exist and I've just missed them.
What are your thoughts?