A few months ago I heard about WindyGrid, a “situational awareness” system for the city of Chicago, and got excited about it. According to this description, it’s a geographic information system that “presents a unified view of city operations—past and present—across a map of Chicago, giving key personnel access to all of the city’s spatial data, historically and in real time.” The system contains information on 911 and 311 service calls, transit and mobile asset locations, building status, tweets by geographical origin, and so forth. The system was designed by Brett Goldstein, then the city’s Chief Data Officer, who has since left the job.
I was initially enamored of this system because the notion of “all the data you need for situational awareness” is quite seductive. Imagine sitting behind your screen, watching the dials and lights—I imagine green, yellow, and red coloring—and monitoring everything that matters. You don’t even have to go out into the cold, windy Chicago streets to figure out what’s going on! You’re situationally intelligent, which is far superior to being situationally ignorant.
I let this idea percolate in my mind for a few months, and resisted the temptation to write a laudatory column about WindyGrid. I even approached Goldstein at one point, but never actually interviewed him.
Over time I changed my mind about situational awareness systems in general, and probably WindyGrid in particular, though I’m sure it’s a useful system. I don’t think there is anything wrong with such systems except for the fact that they are called “situational awareness.” My current feeling is that to be aware of the situation, you need a lot more than data from your operational systems. In fact, the very act of building and closely monitoring a system means that you are unlikely to be situationally aware.
Here’s something that influenced my views on this. I recently read about a Navy fighter pilot, Lt. Nathan Poloski, who launched from an aircraft carrier last September shortly after another plane had taken off. A few minutes later, Lt. Poloski collided in mid-air with the plane that had preceded him, and he died in the crash. The Navy’s investigation of the incident suggested that Lt. Poloski (and the other pilot, who survived) might have averted the collision with more “situational awareness”; that is, he should have looked out the window of his aircraft.
The term is often used in the military and in piloting to mean a complete perception of what is going on around someone. In order to achieve it, a pilot, for example, needs to consult instruments, computers, and navigational aids inside the cockpit, but he or she may often find it useful to gaze out the window now and then. A business or organizational (or city) leader should do the same thing.
Situational awareness is also a term that is widely used in the intelligence industry. If you’re trying to stop terrorism, for example, you’re encouraged to use all the situational awareness you can get. Now there is a trend in intelligence to rely increasingly on signals intelligence or “sigint.” That’s all the data from our phone calls and emails and text messages that intelligence agencies can intercept. It’s definitely a useful source, but “humint”—human intelligence—is also critical for situational awareness. If you are trying to figure out what actions ISIS or Al Qaeda will take next, you need to actually get out and talk to people.
When you have a technology system for situational awareness, I believe there’s a natural tendency to think that monitoring the system or analyzing the data is all you have to do. But the very act of systematizing data sources means that you may be missing something important, or even the big picture of what’s happening in your environment. It’s also important to turn off all the devices and systems occasionally and just think about how everything fits together. Human brains are good at that; computers are not.
There are, of course, some types of systems that make it easier to scan the external environment to better understand what’s going on. I’ve written about Recorded Future, a company I advise that scans and analyzes Internet text to better understand what people are saying and doing around the world, particularly with regard to intelligence and cybersecurity. It’s a very useful tool, but I would still advise looking out the window, walking down the street, or talking to the person at the next barstool now and then.
You may even want more systematic approaches to increasing your situational awareness. Peter Drucker used to love to tell the story of what Alfred P. Sloan, the legendary head of General Motors, used to do on his vacations. Instead of lying on the beach or hitting balls around a golf course, he’d pop into a GM dealership, talk to the sales folks, and even pretend to be a salesperson and work with customers. At one point he was visiting between five and ten dealers a day. He learned a lot about the business that way, and in those days GM was firing on all cylinders. His information systems may not have been as good as ours today, but his situational awareness was great.
Originally published in WSJ’s CIO Journal