Angie’s List: The problem isn’t ride hailing, it’s the lack of road pricing

Streetsblogger extraordinaire Angie Schmitt is not happy with Uber and Lyft. But they're not really the ones to blame.

Are Uber and Lyft to blame for growing urban transportation problems? Streetsblog's Angie Schmitt makes a strong case that they're the villains. Her February 4 article starts out tough:

All the bad things about Uber and Lyft in one simple list: more traffic, fewer transit trips, more traffic deaths, greater social stratification. It's a comprehensive list, and it's long.

And gets tougher.

Here’s the latest evidence that Uber and Lyft are destroying our world

Angie’s list has a pretty damning bill of particulars.  She ticks off ride-hailing for increasing driving, dead-heading, operating in transit-served areas, reducing transit use, increasing crashes, reducing cycling and walking, and hoarding data. There’s no question in our minds that too many cars are a big problem for cities, for transportation, for equity, for safety and for the environment. We think it’s important to look deeper at the underlying causes of “too many cars” rather than to focus blame just on ride-hailing.

At City Observatory, we're huge fans of Streetsblog, the nation's go-to sounding board for transportation policy, and regularly count on ace reporter Angie Schmitt both to bring us up to date on what's happening around the country and to offer trenchant observations. It's rare that we're in anything but enthusiastic agreement with her posts. This is one of those rare exceptions.

Don’t hate the player, hate the game

Rather than being the cause of our urban transportation problems, ride-hailing services like Uber and Lyft are symptoms of deep underlying flaws in transportation policy, the most important of which is that we don't price road space, particularly at peak hours and in dense urban environments, at anything approaching its value.

There’s huge demand to travel in cities, and it has really only been checked by the price of parking and the (historic) numerical limits on for-hire vehicles.  As we’ve pointed out at City Observatory, parking prices serve as surrogate road pricing, and discourage people from driving their personal vehicles to downtown areas. Similarly, the limited number (and relatively high price) of taxis under the medallion system meant that taxis weren’t economically or numerically likely to overwhelm city streets.

The high cost of parking in city centers is a key reason that people ride transit. When bus fare is lower than parking costs, people tend to ride transit.  When parking is free, people drive their private automobiles.

There’s always been a huge, unrequited demand for peak hour travel in urban cores.  People are willing to pay a premium to travel in these spaces–which is why ride-hailing firms are able to impose surge pricing.

What Uber and Lyft have done is evade the two big limits on bringing more cars into central cities. They don't have to pay for parking, because they don't park.  And they've side-stepped the numerical limits on the number of for-hire vehicles imposed by medallion schemes (although New York has recently capped the number of ride-hailed vehicles).

But let's be clear–the underlying problem is not so much Uber and Lyft per se, it's the fact that this very valuable, scarce resource–city streets–is unpriced.  These companies are monetizing and capturing that value for themselves because we don't charge anyone to use the streets.

And of course, our current debate about ride-hailed vehicles is just a small-scale dress rehearsal for the challenges we'll face when autonomous cars become plentiful. Fleets of autonomous cars will overwhelm city streets if those streets aren't priced. Not having to pay a driver will cut fares by more than half from current levels (to as little as 25 to 40 cents per mile, cheaper than transit fares), and it's likely that even owners of private cars will find it cheaper to have them slowly circle the block rather than pay for parking.

Imposing road pricing (or surcharges) just on ride-hailed vehicles or taxis misses the point that privately owned vehicles cause just as much congestion as ride-hailed ones. Trip taking is similar, and cruising for parking is probably as big a source of “wasted” miles in city centers as is dead-heading or cruising for fares.

To be sure, by tapping the latent demand that was held in check by parking prices and taxi limits, the ride-hailing firms have slowed traffic in downtowns like New York and San Francisco. And that has the knock-on effect of reducing bus speeds and productivity, which probably explains part of the reason transit ridership is down (as do lower gas prices).  While it may be emotionally satisfying to paint Uber and Lyft as the villains here, the real problem is our unwillingness to price streets.

In at least one important respect, Uber and Lyft have performed a very important public service: they’ve educated millions of Americans about the marginal cost of automobile travel. Every ride-hailed trip is itemized, and priced by the mile and the minute. And surge pricing begins to show the high value (and cost) associated with travel in peak demand periods (although it allows the companies to capture this economic rent, rather than give it to the public, where it belongs). Moving in that direction is where we’ll find a solution to all of the problems on Angie’s list: The more car use is priced by the mile, and reflects congestion costs, the more efficient our transportation system will be.

Although they’ve been far from model corporate citizens in many respects, to their credit, both Uber and Lyft have endorsed road pricing. Rather than simply vilify them and ignore the more fundamental problem, we ought to be working together to fix a flawed system. It would help to do it before autonomous vehicles make this problem an order of magnitude worse than it is now.

Backfire: How widening freeways can make traffic congestion worse

Widening  I-5 in Portland apparently made traffic congestion worse

Oregon’s Department of Transportation (ODOT) is proposing to spend half a billion dollars to add two lanes to Interstate 5 at the Rose Quarter in Portland, with the hope that it will help relieve traffic congestion. But practical experience with freeway widenings in this area shows that more capacity actually makes the traffic worse. Today we show evidence that when ODOT widened I-5 between Lombard and Victory Boulevard a few years ago, it only managed to funnel more traffic more quickly into the I-5 Columbia River bridge chokepoint. The result: the bridge actually carried less peak hour traffic than before.

If a sink isn’t emptying rapidly enough, pouring more water into it only causes it to overflow more.

A bit of orientation: Interstate 5 is one of two major freeway connections between Vancouver, Washington and Portland, Oregon. There's a large daily flow of commuters from residences north of the river in Washington to jobs in Oregon. Travel across the I-5 bridge (and I-205, a parallel route some five miles to the east) is heavily southbound in the morning and northbound in the evening. As in most US cities, the PM peak is more pronounced, and travel is slower than in the morning peak.

As we related in an earlier commentary, in 2010 the Oregon Department of Transportation completed a $70 million project to widen a mile-long portion of I-5 between Lombard and Victory Boulevard in North Portland, to, in their words, eliminate a bottleneck.

Our earlier analysis examined the traffic crash record for that stretch of freeway, noting that rather than decreasing crashes, the crash rate actually increased after the freeway was widened. So that part of the project didn’t work.  But did “fixing” the bottleneck make the freeway work better?

Today, we take a look at traffic flows across the Columbia River I-5 Bridge, just north of the freeway widening project. In theory, removing the bottleneck should cause traffic to flow more freely.

But what appears to have happened is that the wider I-5 just funneled more peak hour traffic, more quickly, into the bridge area. The roadway jams up sooner, backups start earlier and last longer, and the freeway actually carries fewer cars than it would if traffic volumes were metered more evenly by somewhat lower capacity upstream of the bridge.

As traffic engineers well know, there's a kind of backward-bending speed-volume relationship.  A highway can carry a certain amount of traffic at a relatively high speed, but as more cars are added, the freeway both slows down and loses capacity. What additional capacity can do is push a highway past this "tipping point" more quickly, resulting in slower traffic and lower throughput.
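To make the backward-bending relationship concrete, here's a minimal sketch using the classic Greenshields traffic-flow model (our illustrative choice; the WSDOT chart that follows is based on observed data, not this formula). In this model, flow equals density times speed, and once density passes the point of maximum flow, every additional vehicle lowers both speed and throughput.

```python
# A minimal sketch of the backward-bending speed-volume relationship, using the
# classic Greenshields model.  Parameter values are illustrative, not WSDOT data.
import numpy as np

FREE_FLOW_SPEED = 60.0   # mph on an empty road (assumed)
JAM_DENSITY = 200.0      # vehicles per lane-mile at a standstill (assumed)

def speed(density):
    """Greenshields: speed falls linearly as density rises toward jam density."""
    return FREE_FLOW_SPEED * (1.0 - density / JAM_DENSITY)

def flow(density):
    """Throughput (vehicles per lane per hour) = density x speed."""
    return density * speed(density)

densities = np.linspace(0.0, JAM_DENSITY, 201)
flows = flow(densities)
best = densities[np.argmax(flows)]

print(f"Peak throughput: {flows.max():.0f} veh/lane/hr at {best:.0f} veh/lane-mile")
# Past the optimal density, adding vehicles reduces BOTH speed and throughput:
print(f"Throughput at 1.5x optimal density: {flow(1.5 * best):.0f} veh/lane/hr")
```

The absolute numbers in this toy model aren't meant to match I-5; the shape of the curve is the point. Pushing a road past the tipping point lowers its effective capacity, which is what the data below appear to show.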

Source: Washington State Department of Transportation

Data provided by the Clark County, Washington, Regional Transportation Council (RTC) seem to show that's just what happened on I-5 after the 2010 widening project was completed. Prior to the project's completion, Northbound peak hour flows over the I-5 bridges on a typical Thursday afternoon (chosen as representative by RTC) were consistently above 5,000 vehicles per hour, fluctuating between 5,000 and 5,500. The following chart shows peak hour traffic on I-5 in October 2006, with the blue line corresponding to Northbound traffic.

Notice that for the three lanes of I-5, 5,000 vehicles per hour works out to about 1,700 vehicles per lane per hour, squarely in the "yellow zone" where traffic speeds and capacity become unstable. These data also show that the morning and afternoon peak volumes are approximately equal (both topping out at just over 5,000 vehicles per hour).

Here's a similar chart for 2016 (the most recent year available). On this typical Thursday, Northbound traffic never reached 5,000 vehicles per hour. Notice that the morning peak remains at its earlier level–it's only the afternoon volumes that have fallen. This roughly 10 percent reduction in Northbound afternoon peak throughput over the I-5 bridges is likely a result of funneling additional traffic onto I-5, thanks to the freeway widening and ramp "improvements" ODOT put in place since 2006.

The Regional Transportation Council has generated similar charts for selected years between 1983 and 2016.  (They’ve even got a clever animation of these charts showing how the pattern of traffic has changed over time). We’ve aggregated the data for all the years reported by RTC into a single chart showing the maximum PM peak hour volume traveling Northbound across the I-5 bridges.

After 2010, peak hour volumes Northbound on I-5 have been consistently below 5,000 vehicles per hour, ranging between 4,400 and 4,600 from 2012 through 2016.  (Again, RTC selected data for even-numbered years.)  What these data show is that the hourly volume of cars crossing the I-5 bridges at the peak hour has fallen by close to 10 percent since the bottleneck was removed.

Again, as we noted earlier, southbound traffic in the morning peak hour continues to flow at a rate of about 5,000 vehicles per hour. Our statistical analysis is admittedly a rough first pass, but we've seen nothing in ODOT's analysis of I-5 operations that suggests it's incorrect.

This analysis points up the futility of “bottleneck busting” incremental freeway expansion.  Widening a freeway at one point simply delivers more traffic, faster, to the next bottleneck in the system, causing it to be the new source of the problem.

Ironically, bottlenecks at one point in the system act as "meters" to control the flow of traffic onto subsequent sections of the roadway. Delaying traffic at one point–as we do intentionally with ramp meters–allows the downstream sections of the roadway to flow without exceeding capacity and moving into the backward-bending part of the speed/volume relationship.

The beneficial effects of this metering process are apparent in Seattle's recent experience with the closure of the Alaskan Way Viaduct.  This limited access highway, which carried 90,000 vehicles per day through downtown Seattle (I-5 at the Rose Quarter carries about 121,000), was closed in mid-January 2019. Despite predictions of "Viadoom," based on the theory that traffic would spill over onto adjacent city streets and overwhelm the parallel segment of I-5 in Seattle, traffic in Seattle was, if anything, somewhat better than before the viaduct closed. The Seattle Times reported that "the cars just disappeared." By shutting down the flow of traffic from the viaduct to Seattle streets, the closure reduced the demand on those streets and enabled traffic to flow more smoothly.

The practical experience with widening I-5 shows that eliminating a bottleneck in one place simply leads to the more rapid congestion of the next downstream bottleneck, and, ironically, lower throughput on the freeway system.  It might seem paradoxical that highway engineers would allow this to happen, but if you're more interested in generating excuses to build things than in actually managing traffic flows, it makes some sense.  As we've argued before, it seems as if highway engineers treat the Sisyphean aspects of perpetually chasing bottlenecks not as a bug, but as a feature. To them, the fact that widening one stretch of freeway to eliminate one bottleneck simply creates another is a guarantee of permanent employment, not a fundamental flaw in engineering practice.

Rose Quarter freeway widening won’t reduce congestion

Spending half a billion dollars to widen a mile of I-5 will have exactly zero effect on daily congestion.

The biggest transportation project moving forward in downtown Portland isn’t something related to transit, or cycling (or even bringing back shared electric scooters). It’s a proposal to spend half a billion dollars widening a mile long stretch of Interstate 5 adjacent to the city’s Rose Quarter.

Build it and they will drive. Wider freeways produce more traffic, not less congestion.

The project's been advertised as a congestion-busting, bottleneck-removal project. But sadly, even if the state spends a half billion dollars here, daily traffic conditions won't improve. We know that's true because of the well-documented phenomenon of induced demand. And as it turns out, even state and local transportation experts concede that this is the case.

Induced Demand:  The futility of widening freeways

Time and again, cities around the US and around the world have widened freeways with the avowed purpose of reducing congestion. And it's never worked. One need look no further than the current US record holder for widest freeway, Houston's 23-lane Katy Freeway.  It was most recently expanded in 2010 at a cost of $2.8 billion to reduce congestion. It was even touted by road-building advocates as a poster child for freeway widening projects. But, as we've reported at City Observatory, less than three years after it opened, peak hour travel times on the Katy Freeway were 30 to 55 percent longer than they had been before the freeway was widened. The added capacity was swamped by induced demand, and congestion–and pollution and sprawl–were worse than ever.

No matter how many lanes you add, it ends up like this.

If there were ever any doubt that this was the case, all one had to do was pay attention to what happened in Seattle when the city abruptly closed its Alaskan Way Viaduct, a limited access highway carrying about 90,000 cars a day through the city's downtown. (For reference, I-5 at the Rose Quarter handles 121,000.)  City leaders warned of Carmageddon, gridlock and Viadoom, predicting that downtown streets and freeways would be overwhelmed by the traffic usually carried on the viaduct. But in the two weeks following the viaduct's closure, traffic was at or below normal; as the Seattle Times reported, "the cars just disappeared." The reason is the inverse of induced demand: when you reduce the amount of urban freeway space, traffic doesn't simply back up, it actually goes away (as people take other modes, change when they take their trips, substitute closer destinations for more distant ones, and consolidate trips). Far from being a fixed quantity, traffic is like a gas that expands to fill the space available.

This phenomenon is now so well-documented that it is referred to in the economics literature as "The Fundamental Law of Road Congestion." Adding more unpriced highway capacity in urban settings only generates more traffic and does nothing to lower congestion levels.
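A quick numerical illustration of what the fundamental law implies, sketched in a few lines (the roughly one-for-one elasticity is Duranton and Turner's headline estimate; the traffic figures are made up for illustration):

```python
# Illustration of the "fundamental law": if driving grows roughly one-for-one with
# capacity (Duranton & Turner's estimate), per-lane traffic -- and congestion --
# ends up about where it started.  The volumes below are made-up placeholders.

lane_miles_before = 100.0        # arbitrary units of capacity
daily_vmt_before = 1_000_000.0   # vehicle miles traveled on those lanes
elasticity = 1.0                 # % change in VMT per % change in lane miles

expansion = 0.10                 # widen the road by 10 percent
lane_miles_after = lane_miles_before * (1 + expansion)
daily_vmt_after = daily_vmt_before * (1 + elasticity * expansion)

print(f"VMT per lane mile before: {daily_vmt_before / lane_miles_before:,.0f}")
print(f"VMT per lane mile after:  {daily_vmt_after / lane_miles_after:,.0f}")
# With unit elasticity the two numbers match: the new lanes fill up and
# congestion returns to its prior level.
```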

Even agency experts agree: It’s futile and won’t fix daily congestion

Even the staff of the two agencies most responsible for the project concede that this is the case. Mauricio LeClerc is a principal transportation expert for the Portland Bureau of Transportation. Here’s his testimony to the Portland Planning and Sustainability Commission.

When we did the analysis, the congestion benefit is on the elimination of crashes—non-recurring congestion. The congestion benefit of just adding more lanes was very limited.
Basically you’re fixing something.  Certainly, there’s an improvement, but it’s not very large.  If you are familiar with the freeway system, it’s congested to the north, it’s congested to the south, and if you’re going to I-84, it’s just going to be congested as you enter I-84.  So, it has limited utility, but it does have a very significant safety and non-recurring congestion benefit.  So, we’re not sure what the induced demand, if that gets modeled, it’s potential, but it’s not very large.
(Emphasis added)
Portland Planning and Sustainability Commission
February 28, 2017; at 37:00

This point was also confirmed by Travis Brouwer, a spokesman for the Oregon Department of Transportation, in response to questions posed by Jeff Mapes of Oregon Public Broadcasting.

Jeff Mapes:  It's interesting, ODOT's arguments—that's the Oregon Department of Transportation—you know they've shifted a bit since the battleground has shifted now from the State Legislature to the City of Portland.  And they're emphasizing now more the safety concerns—there are a lot of crashes there—but frankly the large majority of them are fender benders and that sort of thing, and secondly, but basically, they are saying if you take care of a lot of those fender benders, it's going to reduce a lot of delays that frequently happen there.
Here’s Travis Brouwer, he’s the assistant director of ODOT. He makes sort of their subtle case for the project, I guess:
Travis Brouwer: We fully admit that this is not going to eliminate congestion at the Rose Quarter. But we do expect it will make traffic a lot better.
OPB Politics Now, October 12, 2017
(Emphasis added).

There's a bit of nuance here that both LeClerc and Brouwer are alluding to:  the project won't reduce congestion, except perhaps congestion related to crashes. You'll notice that LeClerc refers to the congestion benefit of eliminating crashes and to the "non-recurring congestion benefit." Here's the translation from engineering-speak:  roads get jammed up for two reasons: first, the regular daily flood of traffic at the peak hour, and second, when there's a crash. What LeClerc and Brouwer are saying is that this project will do nothing to reduce the regular daily traffic jams on I-5. As to that non-recurring component, lowering congestion by reducing crashes–we'll take a close look at that in part II of this analysis.

A wider freeway won't mean less daily traffic congestion.  Even though it seems like spending half a billion dollars ought to make a difference, it won't.  The Rose Quarter freeway widening project is either a half-billion dollar ritual sacrifice to the freeway gods, or the world's most expensive piece of performance art. But there is one thing it is surely not: any kind of a solution to daily congestion on a freeway at the center of one of the nation's most vibrant metropolitan areas.

 

Time to get real about climate change

To change the world, we need to change the world…

Editor’s Note:  Ethan Seltzer is an Emeritus Professor in the Toulan School of Urban Studies and Planning at Portland State University. He previously served as the President of the City of Portland Planning Commission and as the Land Use Supervisor for Metro, the regional government. He has lived and worked in Oregon and the Portland region since 1980 and is a contributor to City Observatory. We’re pleased to present Ethan’s latest commentary here:

On a plain reading of the evidence, climate change is occurring in real time. Its effects are being felt, in Oregon and around the world, today and not in some distant and uncertain future.

Transportation GHG emissions have risen during each of the past three years and have grown from 35% of the statewide total in 2014 to 39% in 2016.

Oregon and the nation are off track in curbing vehicle greenhouse gas emissions and straying further away from the necessary pace every day. While electric vehicle sales are ramping up, new gasoline-fueled SUVs are entering the national fleet in far greater numbers. Even California, considered by many to be at the forefront of GHG reduction efforts, is seeing transportation emissions headed upward.

Oregon Global Warming Commission, 2018 Biennial Report to the Legislature for the 2019 Oregon Legislative Session

Recently the Oregon Global Warming Commission made its report to the 2019 Oregon Legislature. Among its findings were two stark conclusions: first, that Oregon was not on track to meet its 2050 carbon emission goals, and second, that rising emissions from the transportation sector particularly were moving us wildly off course.

Transportation is now the largest source for carbon emissions, with passenger car and truck emissions accounting for over half of what the transportation sector emits overall. With a strong economy and continuing population growth, our transportation related climate emissions are soaring despite reported improvements in both fuels and overall vehicle efficiency.

Now, with the Intergovernmental Panel on Climate Change's most recent report flagging the need for rapid progress toward reducing emissions by 2030, not 2050, Oregon's subpar performance lends new urgency to efforts to make real change in our use of fuels in order to substantially reduce the associated carbon emissions. Simply put, electrification of the vehicle fleet, clean fuels, and other technological fixes are important, but won't enable us to meet our goals, goals that appear less ambitious than they need to be by the day.

What Oregon needs now is not just a low-carbon future, but a low-car future.

The good news, if you can call it that, is that we know what we need to do, and what we need to do is largely within our grasp. Decreasing our over-reliance on travel by car and light truck, the pivotal points of consumption identified by the Oregon Global Warming Commission, is something we don’t need the Trump administration to facilitate, and it’s not in the hip pocket of the oil or auto lobbies either.

In addition to making travel a smaller source of carbon emissions, we need to make travel less necessary for sustaining our households and achieving our dreams. For some time, we’ve known that there is a close connection between urban form, the way we’ve physically arranged the places we live, work, and play, and the degree to which we need to slide behind the wheel of a car to make it all mesh. Twenty-five years ago most transportation plans were developed with the expectation that land use would never change, and that the job of infrastructure was to chase new sources of demand for service.

Today we know that where things happen has a huge impact on whether we need more infrastructure to facilitate more interconnections, or less. In a scant quarter of a century, our understanding of “vehicle miles travelled” has gone from a blunt measure of system function to a variable in an equation we, and our communities, can direct. While it used to be revolutionary to posit changes in urban form as a strategy for obviating or minimizing the need for new investments in infrastructure, today it’s seen as a commonsense approach to deploying scarce public resources.

Add to changes in urban form a progressive approach to pricing the use of infrastructure already coursing through our communities, and we have two powerful levers with which we, at the local level, can dramatically affect Oregon’s ability to meet its carbon emission goals. No dramatic technological breakthroughs are needed, only dramatic breakthroughs in political will. We have only ourselves to blame for not making great strides towards the emissions goals we say we seek to serve.

Which is why it's so perplexing to see so many on the right side of the climate issue so unwilling to advocate for meaningful reductions in vehicle miles travelled as a primary vehicle for action and results. How, for example, as both Jenny Schuetz, writing on the Brookings Institution's blog "The Avenue," and Alex Baca, writing for Slate, have pointed out, can a progressive "Green New Deal" completely leave out land use and urban form as key strategies?

As Baca points out, leaving out urban form is not just technically nuts, it also perpetuates longstanding patterns of inequality and racial injustice in our metropolitan regions, patterns that proponents of the Green New Deal suggest are, in fact, the very things it will address. Or, as Steven Higashide writes in The Hill, “It’s not enough to build more transit, as long as federal policy continues to subsidize the highway-and-sprawl machine.” As he puts it, the “highways as usual” policies of the Federal government, reinforced by the inertia of state DOTs, have to change for infrastructure policy in this country to become a tool for reducing carbon emissions associated with transportation.

Both these authors and the Oregon Global Warming Commission have found the same thing: transportation emissions are going to frustrate the best of our climate goals unless we find ways to diminish our dependence on auto travel for every minute aspect of our modern lives.

How ironic that in this age of the internet we are actually moving backwards on meeting our transportation-related climate targets. And how tragic that our investiture of hope in technology is diverting too many of us from addressing both the real challenges and finding real solutions to our inability to meet our carbon goals: reducing vehicle miles travelled in our metropolitan regions.

Climate activists would be well-advised to engage land use planning and urban redevelopment as targets for fundamentally changing the trajectory for local carbon emissions. We know how to design and build cities to favor walking, biking, and transit over car travel. We know how to do it in ways that make our cities both more efficient and, at the same time, healthier and more just for everyone.

And, thanks to years of moving money from the Federal level to the states and localities to build highways, we have created systems that rely on single points of contact for planning and allocating funds—local, state, and federal—to regional transportation systems. These “metropolitan planning organizations” for transportation are an activist’s dream: one-stop points of leverage to insist on investments for reducing rather than increasing, consciously or not, vehicle miles travelled in our communities.

In sum, the way we travel, not to see Aunt Mae but on a daily, weekly basis, is killing us and our hopes for the future. It’s not hypothetical, it’s real. The solutions are not theoretical, we know what they are. And without them, the best of our technological fixes will remain major disappointments. Changing the world we live in everyday is what we, together, have to do.

 

The high price of cheap gasoline

When gas prices stopped diving, Americans again began to drive less

The most fundamental point in economics is that people respond to incentives. Make something cheaper to buy, and people will buy more of it. Make something more expensive, and they’ll buy less. That’s plainly the case when it comes to driving, and one of the biggest and most visible costs of driving is the price of a gallon of gas.

The relationship between prices and driving isn’t perfect and instantaneous. People make decisions about where to live, how far they’re willing to commute to work, whether to own a car (or a second car) and whether to use various other modes (cycling, transit and walking) on a long-term basis. But, especially over time, prices influence all of these decisions.

There's a tendency in much of the literature to treat the elasticity of vehicle miles traveled relative to gas prices as something that adjusts quickly, on a daily or weekly basis.  But consumers are more influenced by larger and longer term changes in prices than by short term fluctuations.  Day-to-day and week-to-week price changes are likely to be "infra-marginal"–too small to notice, and therefore too small to affect behavior. If the price of gas this week is a nickel or a dime less per gallon than a week ago, I'm not likely to do anything different.  But when prices are either much higher or much lower for an extended period of time–say, a dollar a gallon higher than last year at this time–I'm much more likely to factor that into my decisions about how much to drive, what kind of vehicle I buy, where I look for housing or my next job, and so on.  In essence, we can and should safely ignore the short term fluctuations, but pay a lot of attention to the longer term patterns.

Let’s take a look at the connection between gas prices and driving in the US over the past couple of decades. Our data come from the US Department of Transportation (which produces monthly estimates of the total number of miles driven in the US) and from the US Department of Energy, which tracks the retail price of gasoline on a weekly basis. To account for the effects of population growth over time, we’ve divided total miles driven by the US population, and estimated the number of miles driven per person per day in the nation. Gas prices are expressed in current dollars (i.e. not adjusted for inflation). On the following chart, gas prices are shown in red, and per person daily vehicle miles of travel are shown in blue.
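Before turning to the chart, a note on how such a series can be reproduced. Here's a minimal sketch of the per-person, per-day calculation, assuming the monthly VMT totals and population estimates have already been assembled into a CSV (the file name and column names below are hypothetical, not the actual source files):

```python
# Sketch: convert monthly national VMT totals into miles driven per person per day.
# Assumes a CSV with hypothetical columns: month (YYYY-MM), vmt_millions, population.
import pandas as pd

df = pd.read_csv("vmt_and_population.csv", parse_dates=["month"])

# Use the actual number of days in each month so February isn't understated.
df["days"] = df["month"].dt.days_in_month

# Miles per person per day = (total miles / population) / days in the month.
df["miles_per_person_per_day"] = (
    df["vmt_millions"] * 1_000_000 / df["population"] / df["days"]
)

# A 12-month rolling average smooths seasonality before comparing with gas prices.
df["smoothed"] = df["miles_per_person_per_day"].rolling(12, center=True).mean()

print(df[["month", "miles_per_person_per_day", "smoothed"]].tail())
```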

Long Term Patterns of Vehicle Travel:  Four Phases

Over the past two decades, we’ve experienced four distinct phases in the price of gasoline and the attendant patterns of American driving.

Phase 1:  2000-2005:  The era of cheap gas.  Until 2004, gas prices in the US were less than $2 per gallon.  The consistently low price of gasoline led to steady increases in driving by Americans.  Toward the end of this phase, gas prices had risen somewhat, but these increases only slowly had an effect on travel behavior. By this measure, "peak driving" in the US was in June 2005, when Americans drove 27.7 miles per person per day. At the time, gasoline cost an average of about $2.13 per gallon. But by 2005, with gas prices exceeding $2 per gallon, the growth of travel slowed and then flattened out entirely.

Phase 2:  2005-2014:  Expensive gas.  From 2005 onward, gas prices were much higher than in the previous era.  Americans' travel started to decline on a sustained basis, and then, with a spike of gas prices to more than $4 per gallon, combined with the Great Recession, vehicle miles traveled per person fell dramatically.  And even as the recession ended (in 2009), per person travel continued to decline.  During the first five years of the recovery, through 2014, gas prices rose back to more than $3.50 per gallon, and driving continued its slow and steady decline. By 2013, the typical American was driving about 25.7 miles per day, more than 2 miles per person per day less than at the peak. By 2015, Americans were driving fewer miles than at any time in the previous 15 years.

Phase 3:  2014-2016:  Cheaper gas again. In the middle of 2014, oil prices–which had hovered near $100 a barrel––suddenly collapsed to less than half that level, and gas prices fell with them. Consumers reacted quickly. In April 2014, gas prices averaged more than $3.70 a gallon, and people drove an average of 25.7 miles per day. Some 22 months later, in February 2016, with prices averaging about $1.75 a gallon, consumers were driving about 26.7 miles per day, about 4 percent more.

Phase 4:  2016-2018:  A rebound in gas prices.  Over the past two years, gas prices have again trended upward, rising from less than $2 per gallon to nearly $3 per gallon.  As they’ve done so, the growth in per capita driving has slowed, and–once again–reversed.  Vehicle miles traveled per person peaked in late 2017, and have been trending down again since then.

The lesson here is clear:  cheap gas produces more driving; expensive gas leads Americans to drive less.  That fundamental relationship has important implications for the fiscal, social, and environmental consequences of car transportation.  As we've noted at City Observatory, more driving is directly implicated in the surge in road fatalities in the past five years. Cheap gas also generates more car traffic, congesting roadways and taxing infrastructure.  And with more miles driven by heavier vehicles, cheap gas is directly responsible for the growth in greenhouse gas emissions.

There's a tendency to dismiss the importance of the price elasticity of travel demand by overemphasizing the weak short term relationship between gas prices and driving. But as the lesson of these four phases clearly illustrates, that's a mistake.  Sustained high prices for gasoline lead to real reductions in vehicle miles traveled, in pollution and in car deaths.  If we price travel appropriately, consumers will make different decisions–ones that significantly reduce the social and environmental costs of car travel.  Prices matter and should be at the heart of all of our efforts to cope with climate change and build stronger and safer communities.
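As a back-of-the-envelope illustration, here's the arc (midpoint) elasticity implied by the Phase 3 figures above; this is a sketch of the arithmetic, not a formal estimate:

```python
# Back-of-the-envelope arc elasticity of driving with respect to gas prices,
# using the Phase 3 figures cited above (April 2014 vs. February 2016).
price_before, price_after = 3.70, 1.75   # dollars per gallon
vmt_before, vmt_after = 25.7, 26.7       # miles per person per day

def pct_change_midpoint(before, after):
    """Percent change measured against the midpoint, so direction doesn't matter."""
    return (after - before) / ((after + before) / 2.0)

elasticity = (pct_change_midpoint(vmt_before, vmt_after)
              / pct_change_midpoint(price_before, price_after))
print(f"Arc elasticity of VMT with respect to gas price: {elasticity:.2f}")
# Roughly -0.05: small on its own, but applied to a sustained, dollar-plus swing
# in prices it produces the multi-percent changes in driving described above.
```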

 

 

 

A wider freeway won’t reduce traffic

Widening I-5 actually increased crashes, instead of reducing them, and an even wider freeway won’t be less congested if crashes don’t decline.

We're going to dig deep into Portland's proposed freeway-widening controversy today, and in the process we're going to get into some very wonky traffic engineering details. Here's the background: the Oregon Department of Transportation is proposing to widen a mile-long stretch of Interstate 5 through Portland, at an estimated cost approaching $500 million. ODOT is offering up a shifting array of rationales for the project.  While the agency concedes, as we pointed out earlier, that induced demand means the project won't reduce the regular daily traffic jams, it argues that the project will relieve congestion by reducing crashes. The theory is that a wider road will have fewer crashes.

The project's advocates have acknowledged that widening I-5 will do nothing to reduce the daily backups on I-5 that are associated with heavy flows of commuter traffic. Instead, they've built the case for this project on its ability to reduce what they call "non-recurring" congestion–the delays associated with backups due to crashes.

For traffic engineers, congestion comes in two flavors: "recurring" and "non-recurring."  "Recurring" congestion is the predictable daily (usually twice daily) slowing on a roadway that's associated with heavy demand from regular flows of commuters. "Non-recurring" means unusual congestion, the kind that's associated with crashes, construction slow-downs or bad weather. While the distinction is almost certainly irrelevant to those of us stuck in traffic, it's an essential part of the justification for this $500 million project. Superficially, it's a plausible theory, but is it true?

The best evidence of whether the ODOT theory is right is an actual experiment. What happens when you widen a stretch of urban freeway like this one? Do crashes actually decrease?

As luck would have it, we have a timely and close-by real world experiment to examine. In fact, this experiment is on the same roadway, in the same city, and involves exactly the same kind of improvements, designed to solve the same kind of problems. In 2009, the Oregon Department of Transportation spent $70 million widening a stretch of Interstate 5 between Lombard Street and North Victory Avenue. They added a third lane to one side of the freeway, and widened shoulders on both sides. The ostensible purpose of the project was to alleviate congestion and reduce the fender benders that created non-recurring delay.

So if a wider freeway results in fewer crashes, we ought to see it in the data. Let’s take a look at ODOT’s crash data for this stretch of Interstate 5.  ODOT reports crashes on a roadway segment that runs from Lombard Street to the Oregon/Washington border; the project in question represents about half of this segment. Here are the ODOT data on the number of crashes in this roadway.

The data show that prior to the project, this stretch of roadway experienced about 1 crash per 1 million miles driven, with some fluctuation from year to year between 0.9 and 1.1 crashes. Perhaps unsurprisingly, during the project construction period, which included calendar year 2010, the crash rate went up. After the project was completed, the crash rate came down, but has averaged about 1.1 crashes per million miles traveled, perhaps about 10 percent higher than the pre-construction equilibrium.
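For reference, the metric ODOT reports here is easy to reproduce if you have crash counts and traffic volumes for a segment; the sketch below uses hypothetical placeholder numbers, not ODOT's actual figures:

```python
# Sketch: crashes per million vehicle miles traveled (VMT) on a freeway segment.
# All inputs are hypothetical placeholders, not ODOT's actual figures.

def crash_rate_per_million_vmt(crashes, aadt, segment_miles, days=365):
    """Crashes divided by millions of vehicle miles traveled on the segment."""
    vmt = aadt * segment_miles * days      # annual vehicle miles traveled
    return crashes / (vmt / 1_000_000)

# Example: ~120,000 vehicles a day on a 2-mile segment, with 90 crashes in a year.
rate = crash_rate_per_million_vmt(crashes=90, aadt=120_000, segment_miles=2.0)
print(f"Crash rate: {rate:.2f} crashes per million VMT")   # about 1.03
```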

The important point here is that widening this particular stretch of freeway didn’t do anything to reduce the actual number of crashes recorded. If anything, the crash rate went up.

That's got a very important implication for the proposed $500 million Rose Quarter I-5 widening project. This real world experience shows that more lanes and wider shoulders–on this very same freeway, carrying many of the same vehicles–do nothing to reduce the real world crash rate.

What this means is that neither of the supposed traffic improvement rationales for the I-5 widening project is supported by any evidence. The well-known effect of induced demand means that regular daily congestion will continue–a fact that state and local agency experts concede. Their claim that a wider freeway will somehow reduce crashes isn't borne out by the actual evidence from ODOT's last experiment with widening I-5–on a segment of road that carries virtually the same traffic and had (until 2010) the same kind of bottleneck. Instead, widening the freeway increased crashes. Because it will reduce neither recurring nor non-recurring sources of congestion, and may actually make both worse, it makes no sense to spend half a billion dollars on this project if the objective is to reduce congestion.

 

The Week Observed, February 1, 2019

What City Observatory did this week

1. The limits of our current approaches to providing affordable housing. We present a summary of some remarks offered by Rob Stewart, a principal with JBG Smith Real Estate, reflecting on his experience working on housing issues in Washington DC. Broadly speaking, there are two ways we tend to go about encouraging more housing affordability: the supply side approach of encouraging the market to build more housing, and the subsidized housing approach, offering tax credits and other support for the construction of new, purpose built, rent-restricted housing. In practice, our current systems of subsidizing affordable housing produce relatively few units (because new construction is so expensive) and also have tended to reinforce existing patterns of economic segregation (because new units get built in or near high poverty neighborhoods).

2. A third way approach to addressing housing affordability and integration. We explore a new proposal from the Washington Housing Initiative to combine private and public (or philanthropic) capital to buy existing apartments and maintain their affordability for low and moderate income families. By buying units in the likely path of revitalization, the plan can maintain affordability in exactly the kind of neighborhoods that are likely to offer greater opportunities for low and moderate income households. In addition, this strategy would break the tendency of existing subsidized housing programs to perpetuate patterns of economic segregation.

3. Dangerous by Design. It's a very well-researched annual report, but it's a grim one: Smart Growth America has once again tabulated the toll that our nation's transportation system takes on pedestrians.  Each year, more than 5,000 Americans are killed while walking on the nation's streets and roads. While the death toll for those in cars has gone down, the toll for pedestrians is up sharply. More importantly, the patterns of deaths are neither random nor accidental: sprawling metro areas with car-dependent travel systems have consistently higher pedestrian death tolls. More driving means more dying. Higher-speed multi-lane arterials are a particular menace to pedestrians.

4. Happy Groundhog Day, Oregon: You're stuck in a loop on climate. Groundhog Day is coming up again, and we take our usual annual look at how Oregon is doing in achieving its legally adopted goal of reducing greenhouse gas emissions to a fraction of their 1990 levels. Pledging allegiance to the goal is a standard feature of local politicians' Earth Day messages, but as in years past, progress is sadly lacking. The state's Global Warming Commission reports that while the state has made some progress in reducing power plant and industrial emissions, carbon from transportation continues to increase, and there's essentially no way the state will ever reach its carbon reduction goals.

Must read

1. Oregon homebuilders question legalizing missing middle housing. Here's a great story from our own backyard, reported by the Daily Journal of Commerce's Chuck Slothower and brought to wide attention by Iain MacKenzie of NextPortland. Oregon House Speaker Tina Kotek has introduced legislation to legalize duplex, triplex and fourplex housing in residential zones in larger cities in Oregon. You'd think that a group that builds homes and purports to oppose regulation would be on board with this. Not so much:  the Oregon Home Builders Association says it's worried that the bill will undercut the case it's trying to make for expanding urban growth boundaries to allow more low density housing on the periphery of metropolitan areas. That's pretty cynical, to be sure, and shows that despite its name, the organization may not really be interested in seeing more homes built.  In addition, it undercuts the oft-repeated claim that consumers "prefer" suburban living: if the only way to sell more suburban houses is to block the construction of more housing in urban centers, that's a pretty good indication that many consumers bought suburban single-family homes only because they had been denied the opportunity to purchase an affordable duplex, triplex or fourplex home.

2. How to fix congestion in Manhattan in five easy steps. The advent of ride-hailed vehicles has pushed New York City's streets to the limit of what they can handle if traffic isn't priced in some fashion.  In testimony to the New York City Council, Charles Komanoff outlines a five-point plan for applying congestion pricing to Manhattan.  The key: levelling the playing field among taxis, ride-hailed vehicles and privately owned cars. He proposes that rather than singling out some modes for surcharges (like the ones imposed on ride-hailing), all vehicles should pay for using congested city streets in Manhattan. Ride-hailed vehicles and taxis would pay per minute for their time in the center, while private vehicles would pay a cordon charge to enter. And to encourage greater efficiency by Uber and Lyft, Komanoff would add a surcharge for cruising city streets looking for or waiting for fares. Komanoff's modeling suggests this plan would increase traffic speeds, reduce congestion, and raise about $1.7 billion annually to support the city's struggling transit system.

3. Congestion pricing and the green dividend in Boston. The City of Boston's Green Ribbon Commission has just released its report, recommending that in addition to electrifying buildings throughout the city (to enable them to use wind and solar power for heating and cooling), the city should implement some form of congestion pricing to reduce automobile use and encourage mass transit, walking and cycling. The plan also recognizes that higher urban densities would facilitate this shift, and specifically urges the city to upzone urban neighborhoods. The task force also calculated that a shift to greener, denser living would save Bostonians $600 million per year by 2050, through lower energy bills, a classic example of what we call the green dividend. (A hat tip to Curbed Boston.)

New Knowledge

Pollution from cars and trucks lowers student performance in schools near highways. For decades, we've known about the negative health effects of air pollution, but a new economic study shows that air pollution also imposes a significant cognitive cost on young learners. This study looked at the correlation between air pollution levels near schools and the academic performance of students in those schools. It found that students attending schools located near and downwind from busy highways had lower rates of academic performance, higher absenteeism and higher rates of disciplinary problems than those attending less polluted schools. The more traffic on nearby roads, the larger the decline in scores on state standardized tests.

 

The study uses a sophisticated methodology to tease out the effects of school-level pollution, looking at students who moved between schools.  The authors conclude:

. . . children who move to a school downwind of a major highway have lower test scores and a higher likelihood of behavioral incidents and missing school than when those same children attended schools with similar characteristics that were not downwind of a major highway. The effects are larger for more heavily-trafficked roads, and the effects appear to last even after the child moves away from a downwind school. This suggests that once damage from pollution is done, even during middle childhood, it might persist, potentially affecting outcomes far into the future.

Jennifer Heissel, Claudia Persico, David Simon, Does pollution drive achievement? The effect of traffic pollution on academic performance. NBER Working Paper No. 25489.

 

The Week Observed, February 8, 2019

What City Observatory did this week

1.  Measuring Anti-Social Capital.  Thanks to the scholarship of Harvard’s Robert Putnam, the idea of social capital has become firmly entrenched in the policy lexicon. Putnam and others developed some innovative measures of social capital, looking at voting, volunteering, and attitudes about civic affairs and behaviors of personal engagement. This week we update one of our favorite measures of “anti-social” capital:  the number of security guards per capita in each large US metropolitan area. The employment of security guards is a fundamental measure of the degree of distrust and suspicion in a community:  we need to hire guards to protect people and property and to deter people from engaging in anti-social behavior. There are some big variations among US metro areas in the number of security guards.  See where your city ranks.

An indicator of “anti-social” capital?

2. The Market Cap of Cities. Business analysts like to summarize the relative size and importance of publicly-traded companies by computing their "market cap" or market capitalization:  the aggregate dollar value of their shares based on current prices. These figures show that tech giants like Amazon, Google, and Apple are the biggest public companies around. We've used the same approach to compute the market cap of cities, as judged by their housing markets. Home prices, like share prices, reflect the value the market attaches to living in different cities. Using Zillow's estimates of single family home values and our own estimates of apartment values, we've estimated the market cap of the nation's largest metro areas. Collectively, they're worth $30 trillion–more than the fifty largest companies combined. (A rough sketch of how such a tally can be assembled appears after this list.)

3. The proposed half-billion dollar freeway widening in Portland will do nothing to lessen daily traffic congestion. City Observatory readers are intimately familiar with induced demand, and know the “fundamental law of roadway congestion” that afflicts urban transportation. You can’t build your way out of congestion no matter how many free lanes you add because they’ll just attract more traffic. It turns out that even the two agencies proposing the $500 million mile-long widening of Interstate 5 in Portland concede that this is the case. We provide quotes from officials at the Portland Bureau of Transportation and the Oregon Department of Transportation acknowledging that the project won’t lessen recurring traffic jams.

Spend $500 million, and it will still look just like this at 5:15 pm.

4. The problems with Angie's List. Streetsblog's Angie Schmitt has a long list of grievances against ride-hailing companies:  they're adding to traffic, cutting into transit ridership, and more. There's some validity to these concerns, but we think Uber and Lyft are not the problem. Instead, most of what's laid at the feet of ride-hailing is really a result of our under-pricing of valuable and limited road space, particularly in cities and at peak hours. All of the perceived problems of ride hailing would be substantially reduced if we charged everyone for their use of this scarce and valuable resource. Don't blame the player, blame the game.
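Here's the sketch promised above of how a housing "market cap" can be assembled (the file and column names, and the simple owner-plus-rental split, are our illustrative assumptions, not City Observatory's actual methodology):

```python
# Sketch: a housing "market cap" for each metro, by analogy to stock market capitalization.
# Assumes a CSV with hypothetical columns (metro, single_family_units, median_home_value,
# rental_units, est_value_per_rental_unit); not City Observatory's actual data or method.
import pandas as pd

homes = pd.read_csv("metro_housing.csv")

homes["owner_value"] = homes["single_family_units"] * homes["median_home_value"]
homes["rental_value"] = homes["rental_units"] * homes["est_value_per_rental_unit"]
homes["market_cap"] = homes["owner_value"] + homes["rental_value"]

ranked = homes.sort_values("market_cap", ascending=False)
print(ranked[["metro", "market_cap"]].head(10))
print(f"Total across metros: ${ranked['market_cap'].sum() / 1e12:.1f} trillion")
```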

Must read

1.  Why upzoning actually does matter for getting more affordable housing.  The cause celebre in planning circles this week was the publication of Yonah Freemark's study of upzonings in Chicago. Freemark found that even though Chicago changed its zoning code to allow more density near transit stations, new development didn't occur, and home prices went up. Though Freemark was much more guarded about his interpretation of the findings, others, unsurprisingly, used it as "proof" that upzoning doesn't help with affordability. CityLab published a very thoughtful rebuttal to this claim from City Observatory contributor Alex Baca and Cleveland State scholar Hannah Lebovitz.  A key point:  allowable zoning is just one step in a very complex and political process, one that in Chicago is regularly trumped by aldermanic privilege, and subject to considerable negotiation and often political veto.  To those who would use the study as ammo to argue against upzoning, Baca and Lebovitz say:

 . . . the paper shouldn’t be reduced to a “checkmate, YIMBYs” declarative. No one who is intimately engaged with the complexities of affordable housing in America would suggest that zoning is the sole knob to twiddle to increase affordability—and Freemark doesn’t, either. Zoning is targeted because its origins are inherently racist, bigoted, and exclusionary. But, again, it is not the sole input to making housing more affordable. It’s just the one that, by changing it, allows for many other things that make housing more affordable. . . . But, for now, these findings are inconclusive and in many ways detached from the day-to-day reality of how local-level zoning and planning work. We hope they are not used to validate a continuation of exclusionary practices, or misguided power moves by elected officials in American cities and their suburbs.

Our takeaway:  upzoning is necessary, but not sufficient, for getting more affordability.

2. How segregation is behind Chicago's black population exodus.  Writing in the Chicago Reader, Pete Saunders explores the reasons behind the steady decline of the city's black population. What's particularly striking about Chicago is that while regional population has been stagnant to slow-growing, that's masked some big changes, particularly within the City of Chicago itself.  Saunders explains:
After a big drop in the first half of the last decade, the number of white residents in Chicago has grown 9 percent since 2005. Latino growth has slowed significantly, but it’s still up about 5 percent since 2000. Chicago’s Asian population has boomed, growing by 44 percent since 2000.                        
But Chicago’s black population, the city’s largest demographic in 2000, has dropped by 24 percent through 2017, going from more than one million in 2000 to just under 800,000 in 2017. The number of whites in Chicago surpassed blacks in 2017, and Latinos will almost certainly pass blacks by the time of the 2020 census.
Many reasons have been offered for this—Chicago's economy has been somewhat less robust than that of coastal metropolises. But Saunders thinks the black population decline is attributable to the city's history and continued pattern of racial segregation. Again, Saunders:
Segregation has created a lack of economic mobility. I’d argue that Chicago is economically stratified to the extent that upward mobility for blacks here is particularly difficult.  . . .  Networks are hard to penetrate. The power structure is rigid. There’s also a lack of residential mobility.
A perceived lack of opportunity, and the persistence of segregation and neighborhood stigma that follows black residents even to the suburbs, prompts many black Chicagoans to seek opportunity in other metropolitan areas.

3. Reforming rent control in New York. New York City's rent control and rent stabilization laws are up for renewal this year, and the Citizens Budget Commission, a local good-government group, has some thoughts on what the Legislature might consider doing to improve the program. Chief among its suggestions: re-thinking rent stabilization for higher income households. The CBC estimates that more than 28,000 households with annual incomes of $200,000 or more live in rent-stabilized apartments. Collectively, rent stabilization saves them about $271 million annually, compared to what they would be expected to pay in the unregulated market. Phasing out rent stabilization for these households would increase city tax revenues and provide additional resources to support programs for lower income households.

New Knowledge

Technology is making mortgage lending more efficient.  A recent study from New York University's Stern School of Business looks at the performance gains from the increasingly computerized processing of mortgage loans, one of the effects of "fintech" or financial technology.  Fintech lenders include Quicken Loans, Loan Depot and other companies (Editor's note: Quicken Loans is a supporting sponsor of City Observatory). These technology-based lenders have increased their market share since the Great Recession, and now account for about 8 percent of US mortgage lending. The study shows that these tech lenders process mortgage applications about 10 days faster than conventional lenders, even after controlling for variations in loan, borrower and geographic factors.  Importantly, the study shows that default rates aren't higher for these loans, suggesting that the technologically enhanced underwriting is equally prudent in addressing risk.

FinTech mortgage borrowers are attracted to the faster processing times and greater convenience involved with online applications and partial automation of mortgage underwriting. This is consistent with the faster growth of FinTech in census tracts with previously long mortgage processing cycle times and the higher incomes and education of FinTech borrowers. It is also consistent with the high share of refinances for FinTech lenders. We find no empirical support for the hypothesis that FinTech lenders have grown by disproportionately targeting risky, marginal borrowers. Despite the emphasis of the FinTech lending model on online applications and interactions, we also find no evidence that younger borrowers or borrowers located in census tracts with better Internet access are more likely to borrow from FinTech lenders

Andreas Fuster, Matthew Plosser, Philippe Schnabl & James Vickery, The Role of Technology in Mortgage Lending, February 13, 2018

In the News

Our commentary on ten things more inequitable than road pricing made it to the pages of the Greenwich (CT) Free Press.

The Week Observed, February 22, 2019

What City Observatory did this week

It's time to get serious about climate change.  We published a guest commentary from City Observatory friend Ethan Seltzer, who takes a critical look at the largely rhetorical approach the Portland region is taking to the increasingly serious menace of climate change. Globally, the Intergovernmental Panel on Climate Change is warning that time is drawing short to make meaningful progress in reducing carbon emissions. Locally, Oregon's own Global Warming Commission reports that the state is losing ground in its efforts to reduce GHGs, almost entirely because of an increase in driving in the past few years. As Seltzer writes, what Oregon needs is not just a low-carbon future, but a low-car future.

Must read

1. An affordable housing crisis for whomst?  Alan Mallach, author of one of our favorite books of 2018, The Divided City, has a new article at Shelterforce exploring the variations in housing affordability across US metro areas. He points out that affordability per se is not the problem in many declining rustbelt neighborhoods:  there, the key problem is that low-income families simply don't have enough income to afford housing. Even though housing values are low in these markets, Mallach explains why private landlords find it difficult to charge rents affordable to the lowest-income families:

While house sale prices will keep going down nearly to zero until they reach their market level—if there is one—rentals work differently. Landlords have to factor in how much they need for maintenance, reserves, repairs, taxes, and some combination of mortgage payments and an acceptable return on the value of their equity. Landlords also factor in their expectations. If they think their property is appreciating, they’ll accept a lower annual rate of return on equity, because they figure they’ll make up for it when they sell the property. But if they think it’s losing value, they’ll look for a higher annual rate of return to make up for the fact that they may not get their money back if they try to sell it down the road. The lower their expectations are, the more they will try to increase their net cash flow, cutting back on maintenance, and even not paying property taxes. So even a landlord who owns a house worth $0 on the market may still need to rent it for $700 to cover their costs and get a sufficient return.

If a landlord can’t get the minimum rent they feel they need to make ends meet, they are not likely to lower the rent below that level, which would mean knowingly losing money. Instead, they’re more likely to walk away.

In these markets, building more subsidized affordable housing, through a combination of Low Income Housing Tax Credits (LIHTC) and Section 8 Housing Choice Vouchers, may help some families get into better housing, but can easily lead to the abandonment of other market-provided housing.

Moreover, most LIHTC projects are built in high-poverty neighborhoods, areas where sites are available and more CDCs are active, but where total housing demand is not growing. As a result, those projects often cannibalize the existing housing stock; in other words, as new LIHTC units come on line, most of their tenants come from existing rental housing in the same (or similar) neighborhoods, often bringing Housing Choice Vouchers with them. They move out of private market units, or older LIHTC projects, into areas that already have a large surplus of housing, putting even more units at risk of abandonment.

2. Housing and transport from a Japanese perspective. Japan has some of the world's highest housing densities, wide-ranging inter-city rail systems and urban transit, and surprisingly affordable housing.  How do they do it? In an essay at Medium, Brendan Hare explores some of the key aspects of the Japanese system. For one thing, land use regulation is very different: in areas designated for residential development, it's generally just as easy to build apartments as single-family homes. One practical result: Japan's private railway companies have managed to finance their expansion, largely without public subsidy, by developing high-density housing near train stations. And then there is a web of other complementary policies:  as has been widely reported, you can't legally register a car in Japan unless you can show that you've got an off-street parking space in which to store it.  Think of it as a "housing requirement for cars" rather than a "parking requirement for houses," which is what we do in this country.  It's an informative look at how different institutional arrangements can produce very different, and in many respects better, outcomes than we experience in the US.

Tokyo: A rare Godzilla-free day. (Source: MST3K)

3. Speed kills, and highway engineers are all about speed. In her article, "How a singular focus on speed leads state DOTs to overspend and overbuild," Smart Growth America's Beth Osborne takes a close look at the fundamental contradiction between the primary motivation of almost every highway engineering decision–making cars move faster–and virtually everything else we want our transportation system to accomplish.  Whether it's promoting safety, making urban environments more desirable, better connecting land uses, facilitating walking and cycling, encouraging transit or reducing energy consumption, giving priority to faster car trips generally makes things worse.  Osborne shows that the focus on speed also drives state highway departments to choose expensive, inefficient projects that do little to make the transportation system work well for everyone. As she says, "It's nearly impossible to square the priority of speed with most other state goals."

And we’ll add:  the constant struggle to square the deeply ingrained philosophy of “faster, faster, faster” with conflicting policy objectives produces a rich array of convoluted public relations gimmicks, as state DOTs pay hard cash for faster roads, but mostly pay only lip service to other policy objectives.

New Knowledge

Effects of age-based property tax exemptions.  One of the favored classes, when it comes to property taxation, is older households. The stereotype is that older households are poor, live on fixed incomes, and should be insulated from the burden of property tax payments, or tax increases, or both.  The difficulty with the stereotype is that while some elderly households are poor, older people generally, and older homeowners as a group, are now noticeably wealthier than other households.  A new paper from the National Bureau of Economic Research examines the impacts of age-based property tax exemptions on homeownership behavior.  It finds that the tax exemption is associated with higher rates of housing consumption by elderly households.  Over time, older homeowners have gone from owning houses that were, on average, smaller and less valuable than those of younger homeowners, to owning houses that are larger and more valuable.  The analysis suggests that the presence of exemptions leads older households to own homes longer–which is part of the intended policy effect.  But with more older homeowners, fewer homes come on the market to be purchased by younger homeowners. In applying their modeled findings to Cobb County, Georgia, just outside Atlanta, the authors estimate that the tax exemption led an additional 7,600 to 11,700 senior households to continue owning homes in the county, with correspondingly lower numbers of younger households and fewer seniors renting their homes.

H. Spencer Banzhaf, Ryan Mickey, and Carlianne E. Patrick, Age-based Property Tax Exemptions, NBER Working Paper 25468

In the News

The Seattle Times featured Joe Cortright’s analysis of flaws in the Inrix congestion rankings.

The Portland Oregonian cited Joe Cortright’s critique of the Inrix data in its story on city congestion rankings.

Cortright said the report’s economic impact data points are “fiction.” “There’s no feasible set of investments that would let everyone drive as fast at rush hour as at 2 a.m., and the cost of building that much capacity would be far more than the supposed ‘cost’ of congestion.”

The market cap of cities, 2019

What are cities worth? More than the nation's biggest corporations, as it turns out: the value of housing in the nation's 50 largest metropolitan areas ($25.7 trillion) is more than double the value of the stock of the nation's 50 largest publicly listed corporations ($11 trillion).

Market capitalization is a financial analysis term used to describe the current estimated total value of a publicly traded company based on its share price. It's a good rough measure of what a company is worth, at least in the eyes of the market and investors. The market capitalization—or "market cap," as it is commonly called—is computed as the current share price of a corporation multiplied by the total number of shares of stock outstanding. In theory, if you were to purchase every share of the company's stock at today's market price, you would own the entire company.
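To make the arithmetic concrete, here's a minimal sketch of the calculation; the ticker names, share prices, and share counts below are invented for illustration, not actual market data.

# Market cap = current share price x shares outstanding.
# The tickers, prices, and share counts here are hypothetical placeholders.
shares_outstanding = {"AAA": 500_000_000, "BBB": 1_200_000_000}
share_price = {"AAA": 150.00, "BBB": 42.50}

def market_cap(ticker: str) -> float:
    """Return the market capitalization of a (hypothetical) company, in dollars."""
    return share_price[ticker] * shares_outstanding[ticker]

for t in shares_outstanding:
    print(f"{t}: ${market_cap(t) / 1e9:.1f} billion")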

The following chart compares the market cap of the nation's 50 largest publicly traded corporations (on the right) with the market cap of housing in each of the nation's 50 largest metropolitan areas (on the left).  The magnitude of these numbers is a bit staggering; all values are expressed in billions. The data for housing are broken into two components: the value of single-family homes (blue) and multi-family homes (orange).  Sources and methodology for these estimates are described below.

The most valuable company is Amazon, with a market cap of $829 billion; the most valuable metro area is New York, where the market value of owner-occupied and rental housing is over $3.8 trillion—more than four times as much. The current market value of Amazon is about the same as the current market value of housing in Seattle or San Jose, the eighth and ninth most valuable housing markets on our list.

Some modest-sized metros have housing that's worth as much as the entire value of some very well-known corporations: IBM's market cap ($113 billion) is about equal to the value of New Orleans housing ($120 billion). Orlando's housing ($255 billion) is worth more than 50 percent more than all of Disney ($166 billion). Two Seattle-based companies (Microsoft, at $827 billion; Amazon, at $829 billion) are each worth more than all the housing in Seattle (about $776 billion).

The differences are smaller at the bottom end of our two league tables. The fiftieth largest firm, United Technologies, is worth about 25 percent more than the fiftieth most valuable metro housing market, Buffalo: $98 billion versus $80 billion.

Buffalo! Credit: Zen Skillicorn, Flickr

It may seem strange to compare the market value of houses with that of companies, but this exercise tells us more than you might think. Just as the share price of a corporation reflects investors' expectations about the current health and future prospects of a company, the price of housing in a metropolitan area reflects consumer and homeowner attitudes about the quality of life and economic prospects of that metropolitan area. So, for example, as the price of oil has fallen, weakening growth prospects in the oil patch, that weakness has quickly translated into less demand and weaker pricing for homes in Houston. Just as stock market investors purchase and value stocks based on the expectation of income (dividends) and capital gains from their ultimate sale, so too do homeowners (and landlords)—they count on the value of housing services provided by their home as well as possible future capital gains should it appreciate.

In fact, these two commodities—housing and stocks—are among the most commonly held sources of wealth in the United States. And while the financial characteristics of the two investments are dramatically different, the underlying principle is the same, making market cap a useful common denominator for assessing the approximate economic importance of each entity.

Each day, the financial press reports the market's assessment of the value of individual firms through their stock prices. Viewed through the same lens, the housing markets of the nation's cities are an even bigger component of the nation's economy.

Technical Notes

Our estimates are based on the market capitalization of publicly traded, U.S.-based corporations as reported on January 19, 2019.  Our estimates of the value of single-family housing in each metropolitan market were generously provided by real estate experts at Zillow.  For more keen insights on housing markets, follow their work at Zillow's Real Estate and Rental Trends blog.

We supplemented Zillow's estimates of the value of the single-family housing stock by computing the market value of the nation's multi-family housing using data from the Census Bureau's American Community Survey. In real estate, the value of rental housing is usually estimated using a "cap rate," or capitalization rate, which approximates the rate of return on capital that real estate investors expect from leasing out apartments. To estimate the current market value of apartments, we multiplied the median rent in each metropolitan area by the number of multi-family housing units in that area.  We then deducted 35 percent to estimate "net operating income"—the amount the investor receives after paying maintenance, other operating expenses, and taxes—and divided this number by a capitalization (cap) rate of 6 percent. Both of these figures (net operating income and capitalization rates) are rough estimates—values vary across different types of properties, different markets, and over time with financial conditions (such as with changes in market interest rates). Our estimates of the value of the housing stock in each metropolitan area differ from those we published in 2016.
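For readers who want to see the arithmetic laid out, here is a minimal sketch of that capitalization calculation. The 35 percent expense share and 6 percent cap rate are the figures described above; the rent level and unit count are hypothetical, and annualizing a monthly median rent is our own simplifying assumption.

# Sketch of the cap-rate valuation described above.
# The rent and unit count below are hypothetical; the 35% expense share and
# 6% cap rate are the figures used in the methodology described in the text.
EXPENSE_SHARE = 0.35  # share of gross rent going to maintenance, other operating costs, and taxes
CAP_RATE = 0.06       # assumed capitalization rate

def multifamily_value(median_monthly_rent: float, units: int) -> float:
    """Estimate the market value of a metro area's multi-family housing stock."""
    gross_annual_rent = median_monthly_rent * 12 * units  # annualizing the monthly rent is our assumption
    net_operating_income = gross_annual_rent * (1 - EXPENSE_SHARE)
    return net_operating_income / CAP_RATE

# A hypothetical metro with 500,000 multi-family units renting for $1,200 a month
print(f"${multifamily_value(1200, 500_000) / 1e9:.0f} billion")  # about $78 billion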

Measuring “anti-social” capital

The number of security guards is a good measure of a city’s level of “anti-social” capital

In his book Bowling Alone, Robert Putnam popularized the term “social capital.” Putnam also developed a clever series of statistics for measuring social capital. He looked at survey data about interpersonal trust (can most people be trusted?) as well as behavioral data (do people regularly visit neighbors, attend public meetings, belong to civic organizations?). Putnam’s measures try to capture the extent to which social interaction is underpinned by widely shared norms of openness and reciprocity.

It seems logical to assume that there are some characteristics of place which signify the absence of social capital. One of these is the amount of effort that people spend to protect their lives and property. In a trusting utopia, we might give little thought to locking our doors or thinking about a “safe” route to travel. In a more troubled community, we have to devote more of our time, energy, and work to looking over our shoulders and protecting what we have.

The presence of security guards in a place is arguably a good indicator of this “negative social capital.” Guards are needed because a place otherwise lacks the norms of reciprocity that are needed to assure good order and behavior. The steady increase in the number of security guards and the number of places (apartments, dormitories, public buildings) to which access is secured by guards indicates the absence of trust.

The number of security guards in the United States has increased from about 600,000 in 1980 to more than 1,000,000 in 2000 (Strom et al., 2010). These figures represent a steep increase from earlier years. In 1960, there were only about 250,000 guards, watchmen and doormen, according to the Census (which used a different occupational classification scheme than is used today). The Bureau of Labor Statistics reports that the number of US security guards has increased by almost 100,000 since 2010, to a total of more than 1.1 million. As a measure of how paranoid and unwelcoming we are as a nation, security guards outnumber receptionists by more than 100,000 workers nationally.

Sam Bowles and Arjun Jayadev argue that we have become "one nation under guard" and say that the growth of guard labor is symptomatic of growing inequality. The U.S. has the dubious distinction of employing a larger share of its workers as guards than other industrialized nations, and there seems to be a correlation between national income inequality and guard labor.

Just as the U.S. has a higher fraction of security guards than other nations, some cities have more security guards than others. To understand these patterns, we've compiled Bureau of Labor Statistics data on private security guards from the Occupational Employment Statistics survey. BLS defines security guards as persons who guard, patrol, or monitor premises to prevent theft, violence, or infractions of rules, and who may operate x-ray and metal detector equipment. (The definition excludes TSA airport security workers.)

This occupational data reports the number of security guards in every large metropolitan area in the country. Adjusting these counts by the size of the workforce in each metro area tells us which places have proportionately the most security guards–which are arguably the least trusting–and which places have the fewest security guards, which may tend to indicate higher levels of social trust. We rank metropolitan areas by the BLS estimates of the number of security guards per 1,000 workers.  (For particularly large metro areas, we report BLS estimates for the largest metropolitan division in the metro area.)
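As a simple illustration of that normalization, here is a minimal sketch; the metro names and counts are invented placeholders, not actual BLS figures.

# Normalize guard counts by workforce size: security guards per 1,000 workers.
# The metro names and counts below are hypothetical placeholders, not BLS data.
metros = {
    "Metro A": {"guards": 18_000, "employment": 950_000},
    "Metro B": {"guards": 6_500, "employment": 1_100_000},
}

def guards_per_1000(guards: int, employment: int) -> float:
    """Security guards per 1,000 total workers in a metro area."""
    return guards / employment * 1_000

# Rank metros from most to least guard-intensive
for name, d in sorted(metros.items(), key=lambda kv: guards_per_1000(**kv[1]), reverse=True):
    print(f"{name}: {guards_per_1000(**d):.1f} guards per 1,000 workers")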

Security Guards per 1,000 Workers, 2017

At the top of the list is Las Vegas. While the typical large metro area has about 8 security guards per 1,000 workers, Las Vegas has 19 per 1,000.  Miami ranks second, with more than twice as many (18 per 1,000) as the average large metro. Other cities with high ratios of security guards to workers include Memphis, New Orleans and Baltimore. Washington D.C., with its high concentration of government offices, defense and intelligence agencies, and federal contractors, also has a high proportion of security guards.

At the other end of the spectrum are a number of cities in which the ratio of security guards to workforce is one-third lower than in the typical metro area. At the bottom of the list are Minneapolis-St. Paul, Grand Rapids and Portland, all with fewer than six security guards per 1,000 workers. (The Twin Cities and Portland also do well on most of Putnam's measures of social capital.)

It seems somewhat paradoxical, but the salaries paid to security guards are treated as a net contribution to gross domestic product. Yet, in many important senses, security guards don't add to the overall value of goods and services so much as they keep the ownership of those goods and services from being rearranged. As Nobel Prize-winning economist Douglass North has argued, we ought to view the cost of enforcing property rights as a "transaction cost." In that sense, cities that require lots of guards to assure that property isn't stolen or damaged, and that residents, workers, or customers aren't victimized, actually have higher costs of living and doing business than other places. These limits on easy interaction may stifle some of the key advantages of being in cities.