san francisco's building-street interface

Benjamin Grant, writing for SPUR, on the downsides of San Francisco's urban fabric. A beautiful city, with a thoughtless approach to public space:

In a typical San Francisco street, both of these transitions [parking lanes and stoops] are absent or compressed. Curb cuts often preclude trees, curbs, and parking, exposing pedestrians to traffic and preventing the definition of a distinct pedestrian space. Buildings often lack a setback, reducing the privacy of residents and the comfort of pedestrians. Stoops and entry stairs are usually articulated inward, creating deep, cave-like spaces with poor security and no social use.

And, as San Francisco has a deeply ingrained relationship with its past, it seems unable to break the pattern.

The thoughtless and precipitous upheavals of mid-century urban renewal have made San Franciscans especially protective of the past. But the historic patterns of this city’s residential neighborhoods don’t offer especially good models for livable streets. And to a surprising extent, these patterns continue to be replicated in new construction as designers, developers and policymakers look to the city’s older neighborhoods as points of reference. Even as architectural styles, building codes, parking requirements and other parameters evolve, the urban fundamentals have remained surprisingly unchanged. It’s as if this pattern is woven into the city’s DNA, as if we are a city congenitally predisposed against good streets.

good design

Dieter Rams, speaking about his design principles:

Good design is innovative. Good design must be useful. Good design is aesthetic design. Good design makes a product understandable. Good design is honest. Good design is unobtrusive. Good design is long-lasting. Good design is consistent in every detail. Good design is environmentally friendly. And last but not least, good design is as little design as possible.

the plan

The other day someone asked me what I was going to do next. I said, “for the first time in my life, I haven’t really thought about that… for over a year.”

My philosophy on career planning is this: always be open to options. When an option presents itself, discern whether it is more interesting than the current path. If it is more interesting than the current path, pursue the option. If it is not more interesting than the current path, stay on the current path. People who are most interested in and engaged by what they are doing are the most successful. (And happiest.)

This philosophy is directly opposed to the career planning advice I got in college and grad school: make a 10-year plan and then follow it.

If you had told me 10 years ago that I’d be doing what I’m doing right now, and all that I’ve done along the way, I would have laughed in your face.

My generation suffers from a path addiction. (No, not that Path.) Thanks, in part, to over-parenting, we are placed on a path from the moment we can walk and we follow it. And it becomes delightfully comfortable.

We get put in preschool, then grade school, then we are prescribed extra-curriculars. Then we go to junior high school. High school comes next and then after that college. Always working toward the next step on the path. In college we are coached to develop “10-year plans.” Super-paths! And then we graduate. Real world. Shock. Horror.

Lately, as I have counseled friends on career decisions, and as I’ve started offering advice to recent college grads, I’ve learned something: we spend way too much of our early years knowing (and being prepared for) what’s coming next. When we finally get to the part where we have to fend for ourselves, develop a course of action and then execute, we fall down. Hard.

When offering guidance, I’ve too often been met with “but I’m not qualified for that” or “nobody would believe I’m capable of that.” We believe that, because someone else is doing something, they must have been anointed to do so. We think, “someone, somewhere validated that role.” I’d offer that the opposite is more likely true: most people aren’t qualified to be doing what they’re doing. Instead they were presented with an opportunity, and rather than waiting to be told they were ready, they simply jumped at the chance and figured it out as they went.

Here’s a favorite passage from The West Wing:

Leo McGarry: Because I’m tired of it: year after year after year after year having to choose between the lesser of who cares. Of trying to get myself excited about a candidate who can speak in complete sentences. Of setting the bar so low, I can hardly bear to look at it. They say a good man can’t get elected President. I don’t believe that. Do you?

Jed Bartlet: And you think I’m that man?

Leo: Yes!

Bartlet: Doesn’t it matter that I’m not as sure?

Leo McGarry: Nah. “Act as if ye have faith and faith shall be given to you.” Put another way: Fake it ’til you make it!

Fake it ’til you make it. Or: put down the path. Step away from the plan. Blaze your own trail. You are what you say you are – and nobody else needs to validate that. In fact, nobody else can.

our second screen obsession

Hollywood has grown fixated on the second screen. The “second screen” refers to the other thing we’re paying attention to while watching television or movies – this could be a phone, tablet, or computer.

The industry fixates on the second screen because it feels threatened, and rightly so. In an era of alt-tab attention spans, fewer people are coming to long-form content by traditional means, if they are coming at all. Between the actual dollars and the perceived mental commitment, the cost of consuming content longer than your average cat video on YouTube is not negligible.

Further, over the last 30+ years, as distribution channels have expanded (the movie multiplex, the 500+ cable television channels), quantity of content has increased while quality of content has decreased. Fortunately for Hollywood, this bad content didn’t have much to compete with in order to capture an audience.

In the late 1990s, we saw widespread consumer adoption of the Internet. And the rise of file sharing. And the slow-and-then-fast disruption of the recorded music industry.

Hollywood watched this happen.

Then came broadband. And suddenly it was easy to share and stream the larger video files that dialup couldn’t support. And Hollywood started to worry, mostly and incorrectly, about piracy.

Then came Facebook. And suddenly there was a whole new type of content that was, in many cases, far more compelling than Hollywood’s: social media. This content is always fresh and, most importantly, it is free.

Then came the iPhone. And suddenly consumers could take this new stream of social content with them wherever they went – including the couch and the movie theater.

And now we have arrived at the present model of consumer behavior: viewers watch Hulu in one tab while they Facebook in another, play Angry Birds on the couch while ESPN is on the television, and text each other while they’re in the movie theater.

And, bizarrely, Hollywood’s reaction to this is: LET US OWN THIS NEW SCREEN.

Thus is born the “innovation group” at BigMediaCo. Thus do ex-producers decide to launch startups to “capture the second screen experience.” Thus does an entire industry begin to wonder if this Silicon Valley place might become “a thing.”

Every few weeks, I get pitched a second screen opportunity, mostly by people who are content creators in Hollywood. Most of these businesses provide access to bonus content related to the first-screen experience, allow viewers to socialize with other people who are also consuming the first-screen experience, or, most commonly, create new revenue streams related to the first-screen experience.

And it’s started to make me wonder… are we stupid?

The job of Hollywood is to create content (read: tell a story) that so delights an audience that the audience is willing to pay to consume this content. Why is Hollywood spending any time creating experiences to further distract the audience? Rather than fixate on the shiny new thing, shouldn’t Hollywood be doubling down on finding and telling incredible stories? Wouldn’t we put our second screens down in a heartbeat if we were actually being delighted by the content on the first screen?

Social media has made it nearly impossible to recoup the investment in bad content. A big marketing spend can no longer buy a bad movie a mediocre opening weekend; Twitter kills a bad movie after the first few screenings on Friday. But content is still king, and good content spreads itself. Social media has also spawned a whole new consumer behavior: incredible television can now be discovered and binge-viewed season by season online (see: Downton Abbey).

Let’s pause. Let’s not take our eye off the ball. Let’s not pretend that the second screen is going to solve some underlying structural problems in the entertainment industry. Let’s instead be confident that we can tell wonderful stories, delight our audiences, and that in return our audiences will reward us with dollars and, more importantly, their undivided attention.