Saturday, August 23, 2014

Hacking And Politics

I've got a short definition for hacking that I like. It's independent of technology and avoids the nasty implications of many common mental models. It's not as succinct and elegant as the definition Richard Stallman gives, where hacking is "clever playfulness," but I think it is much better as a functional definition.
Hacking is a method of circumventing an accepted expectation or system to achieve a goal faster, more efficiently or more effectively than would otherwise result.
Here a system can be described as a set of processes. A computer system, therefore, is a set of computer processes. Software code defines these processes, and the computer executes them with or without human interaction. So computer hacking is merely circumventing these codified processes. The process owner must protect against circumventions that lead to undesirable outcomes; a failure to do so can be called a security flaw. A person who circumvents processes, then, can be called a hacker. They didn't make the flaw, only found and used it.

This definition also works for other types of hackers. There are popular communities and labels of travel hacker, life hacker, social hacker, growth hacker, etc. I've even seen articles about garden hacking. Can you find an activity or thing that returns no results in a Google search when you append "hacker" to it?

Laws are another example of accepted expectations and processes. Hackers - and I include social, political and business hackers like entrepreneurs - see the loopholes, intentional or otherwise, in the laws that were created. And that makes them potentially threatening to the governments that created those laws. But not necessarily. A government that seeks to codify accepted expectations and processes should seek out feedback from hackers if it wishes to create better laws.

Most of these flaws are not discovered before laws formally codify them. As in computer hacking, some people seek to discover and use these flaws for their own benefit, and some seek to discover and publish them so they can be fixed. A government's response to this publication reveals its willingness to make laws with minimal flaws.

Some see a heavy-handed response to quash public knowledge of flaws in the policy and legal code as an indication that the flaws were created intentionally. I am not one of those people. I think it instead better resembles the reaction software makers have when researchers point out flaws in their code. They go through Katie Moussouris's five stages of vulnerability response grief. (It's a good 7-minute watch.)

I would say these political, economic and other systems must be tested for flaws so they can be addressed. Think of this as hardening politics, the economy and the social order. If we don't help to harden it we are helping those who seek to gain personal advantage from the flaws. We can use structures and frameworks for security testing as blueprints until better ones are available. Or maybe they already are and I've just missed them.

What are your thoughts?

My Infosec Origin Story

Everybody likes a good origin story. Especially these days with comic book heroes and villains getting origin story movies and TV shows. It explains a lot about the character that you've come to know well and helps you understand motivations and brain wiring.

The other day I was talking to someone about how I got into infosec and I realized I could sum it up pretty quickly. So here's my own personal infosec origin story:

  • Always been a technology guy. But then it started to get boring.
  • Started breaking technology - hacking. Then that got boring.
  • Protected technology and information from the breakers. Boring.
  • Now making and breaking business plans, models and ideas. Disruptive entrepreneurialism. It's not boring yet, but we'll see.

Friday, August 22, 2014

Eliminating Ostrich Effect In CEOs

NPR reports on a new study that demonstrates the ostrich effect in people. That's where the ostrich buries its head in the sand to avoid danger. The ostriches who do this and then don't die go on to have babies, and the gene for this behavior gets passed on. I don't know if ostriches really do that or not. Still, it's a staple of cartoon humor. Silly birds.

The results of the study indicate that people will pay money to avoid learning health news that might be negative. Specifically, subjects paid $10 to NOT have their blood tested for Herpes Simplex 2 (that's the worse one). Ever have a relative or friend who avoided going to the doctor even though there was clearly a problem? Ostrich effect. Silly birds.

That's a very interesting study. If you think about it, this makes sense:
  • Touching a hot stove hurts. So we avoid touching it by using an oven mitt or simply keeping hands away.
  • Bad news is painful. So potential of bad news makes us avoid information altogether.
This can be generalized to partially explain why many leaders throughout history (and today) have surrounded themselves with toadies. They don't want to hear the bad news so they gradually insulate their mind with the functional equivalent of oven mitts. 

Look at CEOs and cybersecurity. The CISO almost always brings bad news, so they just stop inviting him, or put him under someone else who can oven-mitt him away. Ostrich effect. Silly birds. Silly environmental threats.

Knowing this, a CISO can change his content or delivery to reduce the effect. For instance, putting today's report in context: "Yes, we're not great, but we're doing much better." Or bringing good news along with any bad news, such as "We saved over $300K last year in employee downtime by reducing malware infections. Also, we still had lots of malware infections." Or keeping a smile throughout the conversation, maybe telling a joke or two to improve the mood. Every little bit helps. Or bring donuts. Executives love donuts, don't they?

Wednesday, August 13, 2014

Are We Magicians?

Computerized technology pervades every aspect of our lives, from cars to medical devices to, increasingly, every electronic thing around us. Only a select few understand this technology well, meaning that for most people it is well beyond what they know how to use and manipulate effectively. As Arthur C. Clarke said, "any sufficiently advanced technology is indistinguishable from magic." For most people, the world around us is indistinguishable from magic.

So people who can bend this technology to their will are indistinguishable from magicians. And that is just what hackers do - we use our techniques, tactics and processes to bend this technology to our will. This mastery gives us the power to manipulate and control the world around us, and with great power comes great responsibility. It is time for hackers to assess the way we use our power - or don't use it - and ask whether what we are doing is responsible.

This year at Black Hat and DEF CON, two of the premier hacker conferences, the theme seemed to be hacking altruism. For the past few years I've noticed a trend of information security people advocating for fixing problems, not just finding them. Over the past year or two I think the community has realized that our ability and responsibility to impact the world reaches far beyond the technical.

For years now the community has been helping its own. Community members with problems get help, from money to marrow. Now we have begun looking outward, to others who need our knowledge and experience to get any help at all.

I like this recent development and so do most in the community. So look for more impact from hackers coming soon to a problem space near you!