Inequality in three charts: Piketty, the picket fence and Branko’s elephant

Rising inequality in the US isn’t new; declining inequality globally is.

Scratch just beneath the surface of many daily problems, and you’ll find income inequality is a contributing factor, if not the chief culprit. Whether it’s concentrated poverty, soaring housing costs, disparities in educational attainment and public services, or the nation’s political divide, it all seems related to growing inequality.

US Inequality: The picture is getting sharper, but the trend has been evident for 25 years

A new chart published earlier this month in the New York Times brings the magnitude of the inequality problem into sharp focus. Based on work by Thomas Piketty and his colleagues, it shows how much incomes have changed at every point in the income distribution. As the chart makes plain, income gains in the US have been highly concentrated in the top 1 percent of the population (and within that group, within the top 0.001 percent). The chart is from an excellent analysis published by Vox, which explains Piketty’s research in more detail. What the chart shows is that for those in the bottom quintile of the income distribution (the lowest 20 percent), gains in real income, prior to taxes, were negative between 1980 and 2014; only the net effect of tax changes brought their income change up to zero.

But while sharply detailed and shocking, this isn’t really new. A quarter of a century ago (1992), the then relatively obscure though outspoken academic economist Paul Krugman wrote an article (“The rich, the right, and the facts”) for The American Prospect contrasting what he called “the picket fence and the staircase.” Though much cruder than the measures generated by Piketty, Krugman’s computations show the same trend. During the growth period from 1945 to 1973, the growth in US incomes resembled a picket fence (each quintile of income seeing roughly the same percentage increase); after 1980, the growth in incomes resembled a steep staircase (with income growing faster for those with the highest incomes).

Paul Krugman’s Picket Fence & Staircase

It’s easy to imagine that the same processes are at work globally. It’s still the case that global income is highly unequally distributed, and those in Europe and North America have a disproportionate share of world income. But while still highly unequal, the global distribution of income has become much less skewed in the past couple of decades. And to contrast with Piketty’s curve and Krugman’s picket fence and staircase, we have Branko’s elephant. Drawn by economist Branko Milanovic, this chart depicts the global distribution of income in exactly the way that the other charts depict US income gains. The horizontal axis corresponds to average income and the vertical axis measures the percentage increase in income between 1988 and 2008.

The elephant shape of the curve shows low gains among the very poorest of the global poor (at the far left), sizable gains for those in the 10th through 70th percentiles of the global distribution of income, and a sharp falling off of gains for those in the 75th through 90th percentiles–with actual declines in income for some near the 80th percentile. The elephant’s soaring trunk (C) is the income gains of the top 1% and 5% of the global distribution.

The global richest are getting richer, but so are the bottom two-thirds of the world’s population

Branko’s elephant captures the startling improvements in measured income, led by growth in Asia, and particularly the development of India and China. Those in the middle of the income distribution actually recorded the fastest income gains. The residents of these countries have seen their real incomes increase substantially in the past two decades. And as Milanovic points out, this development has produced the first decline in global income inequality since the industrial revolution began. The gains at the very top (the tip of the trunk) reflect the superstar rich throughout the world. And the falloff around the 85th percentile (the drooping bottom of the trunk, at B) reflects the plight of those in advanced economies who haven’t benefited from globalization and technological change. These include the bottom half of the income distribution in many “developed” economies, and many modestly educated workers in the US.

So when it comes to inequality, there’s bad news and good news here. The concentration of income among the top 1 percent is real and growing, both in the US and globally. But it’s also the case that a huge fraction of the world’s population enjoys a higher income level today than two decades ago, and as a result, the global distribution of income is less skewed (at least for the bottom 95 percent of the population). And it’s important to note that the inequality problem isn’t playing out the same way everywhere: some nations have experienced much less of an increase in inequality than the US. Differences in national policies and institutions play a critical role in determining income distributions.


Driven Apart: How sprawl is lengthening our commutes

The secret to reducing the amount of time Americans spend in peak hour traffic has more to do with how we build our cities than how we build our roads.

Our 2010 report, published by CEOs for Cities, looks at how land use patterns determine travel distances. Commonly used measures of traffic congestion, like the travel time index, conceal the role of sprawl in adding to travel times. Cities with more compact development patterns have shorter travel times, a fact not revealed by congestion statistics.

While peak hour travel is a perennial headache for many Americans – peak hour travel times average 200 hours a year in large metropolitan areas – some cities have managed to achieve shorter travel distances and actually reduce peak hour travel times. The key is land use patterns and transportation systems that enable their residents to take shorter trips and minimize the burden of peak hour travel.

That’s not the conclusion promoted by years of highway-oriented transportation research. The Urban Mobility Report, produced annually by the Texas Transportation Institute and widely used to gauge metropolitan traffic problems, has completely overlooked the role that variations in travel distances play in driving urban transportation problems.

This report offers a new view of urban transportation performance. It explores the key role that land use and variations in travel distances play in determining how long Americans spend in peak hour travel. It shows how the key tool contained in the Urban Mobility Report – the Travel Time Index – actually penalizes cities that have shorter travel distances and conceals the additional burden caused by longer trips in sprawling metropolitan areas. Finally, it critically examines the reliability and usefulness of the methodology used in the Urban Mobility Report (UMR), finding it does not accurately estimate travel speeds, it exaggerates travel delays, and it overestimates the fuel consumption associated with urban travel. How we measure transportation systems matters, and the nation needs a better set of measures than it has today.

Why Travel Time Indexes Mislead

To show how the travel time index conceals the real burden of peak hour travel, we compare two cities, Chicago and Charlotte.  According to the travel time index (TTI), congestion is much worse in Chicago, which has a TTI of 1.43, compared to Charlotte’s TTI of 1.25.  Chicago’s TTI means a peak hour trip takes 43 percent longer than the same trip taken in off-peak hours.  But what the TTI hides is the fact that average trips in Charlotte are much longer (19 miles) than in Chicago (13.5 miles).  Chicago commuters face basically the same amount of delay (about 9 minutes), but their total travel time (32 minutes) is much less than in Charlotte (48 minutes) because the region is so much more compact.

A Comparison of Charlotte and Chicago
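The arithmetic behind this comparison is simple enough to sketch. The trip lengths, peak travel times and TTI values below are taken directly from the text; the delay calculation follows from the definition of the Travel Time Index (peak time divided by free-flow time):

```python
# Sketch of the arithmetic behind the Chicago/Charlotte comparison.
# Trip lengths, peak travel times and TTI values are taken from the text.

def peak_delay_minutes(peak_minutes: float, tti: float) -> float:
    """Delay = peak travel time minus free-flow travel time.

    The TTI is defined as peak time / free-flow time, so
    free-flow time = peak time / TTI.
    """
    free_flow = peak_minutes / tti
    return peak_minutes - free_flow

cities = {
    "Chicago":   {"miles": 13.5, "peak_minutes": 32.0, "tti": 1.43},
    "Charlotte": {"miles": 19.0, "peak_minutes": 48.0, "tti": 1.25},
}

for name, c in cities.items():
    delay = peak_delay_minutes(c["peak_minutes"], c["tti"])
    print(f"{name}: {c['miles']} mile trip, {c['peak_minutes']:.0f} minutes "
          f"at the peak, of which about {delay:.1f} minutes is delay")
```

Both cities come out with roughly the same delay (about 9.6 minutes), which is why the TTI alone misreads the burden: Charlotte’s longer trips mean more total time behind the wheel even though its index is lower.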

Land use patterns and trip distances are the real source of travel burden

Travelers in some cities – those with more compact development patterns – tend to spend less time in peak hour traffic because they don’t have to travel as far.

If every one of the top 50 metro areas achieved the same level of peak hour travel distances as the best performing cities, their residents would drive about 40 billion fewer miles per year and use two billion fewer gallons of fuel, at a savings of $31 billion annually.

In the best performing cities the typical traveler spends 40 fewer hours per year in peak hour travel than the average American because of the shorter distances they have to travel.

In the best performing cities – those that have achieved the shortest peak hour travel distances – such as Chicago, Portland and Sacramento, the typical traveler spends 40 fewer hours per year in peak hour travel than the average American. In contrast, in the most sprawling metropolitan areas, such as Nashville, Indianapolis and Raleigh, the average resident spends as much as 240 hours per year in peak period travel because travel distances are so much greater. These data suggest that reducing average trip lengths is a key to reducing the burden of peak period travel. Over the past two decades, for example, Portland, Oregon, which has smart land use planning and has invested in alternative transportation, has seen its average trip lengths decline by 20 percent.



Why we’re talking about Portland’s freeway widening proposal

Portland is a bellwether for transportation policy; is it going to take a giant step backward?

Last month, the Oregon Legislature passed a $5.3 billion transportation funding bill. A central piece of this legislation advances three projects that would widen Portland area highways. HB 2017A makes initial allocations of funding to start (but not necessarily enough funding to actually complete) the addition of lanes to Interstate 5 near downtown Portland, to Interstate 205 in the region’s southern suburbs, and to State Highway 217 in the western suburbs. Each of these projects is being sold as a “bottleneck” buster, on the assumption that just a little bit more freeway capacity will somehow magically resolve daily traffic congestion. After decades of progressive leadership in transportation policy, “Portland freeway widening” has a certain “man-bites-dog” quality as a story, but we think there’s something more here.

Regular readers of City Observatory might well ask why we’re spending so much time talking about a proposal to spend upwards of a billion dollars widening three Portland-area freeways. And it’s perfectly fair for them to think that we’re being a bit parochial focusing on these projects (City Observatory is based in Portland). So it’s true that we have more than an academic interest in the proposed projects.

But we think something more is at stake, and that Portland represents a kind of bellwether for moving US transportation policy forward.

Back in the 1970s, Portland and Oregon were national leaders in a broad range of environmental policies, from cleaning up badly polluted local rivers, to making sure the state’s beaches remained in public ownership, to implementing the nation’s first beverage container deposit law, to requiring statewide land use planning, including drawing urban growth boundaries around each of the cities in the state. In many ways the signature items in this environmental litany were the decisions to demolish one freeway (downtown Portland’s Harbor Drive) and to cancel another (the Mt. Hood Freeway). These decisions, and a range of supporting policies that unfolded in their wake (building a regional light rail system, greatly expanding biking, bringing back the streetcar), made Portland an important leader in using a different approach to transportation to build a great urban place.

Days of future, past? (Portland’s Harbor Drive Freeway, circa 1962–Now Waterfront Park).

Looking back, what is it we knew 40 years ago that we no longer know today? At the dawn of the environmental movement, Portland was willing to take bold steps that not only challenged the conventional wisdom, but pushed the boundaries of policy with a steady stream of civic innovation.

We now know that climate disruption is ongoing, and that unless we take immediate and decisive action to reduce carbon emissions, we are likely to permanently damage the planet.

Looking forward, with the days of the internal combustion engine numbered, with autonomous vehicles moving steadily closer to market, why are we intent on putting a huge chunk of scarce public resources into a minor, out-dated and ineffective part of the transportation system?

The proponents of the project are trotting out an impressive array of myths.

  • They argue additional capacity is needed to reduce congestion. It’s now a settled fact that additional un-priced roadway capacity in urban areas simply generates more traffic, congestion and pollution.
  • They argue that reducing idling in traffic will cut greenhouse gas emissions. Studies by engineers at Portland State University show this is a myth, and that highway expansions lead to more greenhouse gases.
  • They play the safety card, talking about crashes. But freeways are five times safer than city streets, according to the region’s transportation planning agency, Metro. And congested freeways actually have fewer serious crashes than fast-moving ones.

And this project comes forward at exactly the time the Oregon Legislature is being told by its official Global Warming Commission that the state is going to fail to meet its legally adopted goal for reducing greenhouse gas emissions because of an increase in driving in the state.

There’s one glimmer of hope on the horizon. As part of its newly passed transportation legislation, Oregon has directed its Department of Transportation to seek federal permission to toll the urban portions of Interstate 5 and Interstate 205. Peak hour real-time tolling of freeways could reduce congestion and greenhouse gases without expanding capacity at all. If it moves ahead with a comprehensive, state-of-the-art tolling system for freeways, Portland could again put itself in the ranks of innovative cities when it comes to transportation.

So in the months ahead, dear readers, we hope you’ll bear with us as we turn a careful eye to Portland’s freeway widening proposals. We think there’s a lot going on here that’s of more than local interest.

If you want to get a flavor for how this issue is being discussed locally, here’s City Observatory’s Joe Cortright discussing the proposal to widen Portland area freeways with State Senator Lee Beyer, the author of the legislation, on Oregon Public Broadcasting’s “Think Out Loud.”


What Dallas, Houston, Louisville & Rochester can teach us about widening freeways: Don’t!

Portland is thinking about widening freeways; other cities show that doesn’t work

Once upon a time, Portland held itself out as a national example of how to build cities that didn’t revolve (so much) around the private automobile. Back in the 1960s and 1970s, it recognized that building more freeways just generated more traffic, and it tore out one downtown freeway, and cancelled another, and instead took the bold step of investing in transit and encouraging greater urban density.

But now the region is confronted with proposals to spend upwards of a billion dollars on three freeway widening projects. The idea that widening freeways will reduce congestion has been thoroughly debunked. Economists now talk about the “Fundamental Law of Road Congestion”–each incremental increase in highway capacity generates a proportionate increase in traffic, with the effect that congestion quickly rebounds to previous levels–accompanied by more sprawl, longer trips and increased pollution. As it contemplates spending that money, Portland might want to spend a little time looking at what’s been learned in other cities around the country. The experiences of four cities confirm the lessons that Portland thought it learned four decades ago.
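The “proportionate increase” claim amounts to saying that the elasticity of vehicle-miles traveled with respect to capacity is about 1. A minimal sketch of what that implies (the numbers are illustrative, not drawn from any particular study):

```python
# If the elasticity of traffic with respect to capacity is 1, any
# percentage increase in lane-miles produces the same percentage
# increase in vehicle-miles traveled (VMT), erasing the speed gains.

def induced_vmt(base_vmt: float, capacity_increase_pct: float,
                elasticity: float = 1.0) -> float:
    """VMT after a capacity expansion, given a demand elasticity."""
    return base_vmt * (1 + elasticity * capacity_increase_pct / 100)

# A 10 percent capacity expansion on a corridor carrying 100 (index) VMT
# ends up carrying about 110: the new lanes fill up.
print(induced_vmt(100.0, 10.0))
```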


Add as many lanes as you like, you’ll just get more traffic and congestion

Adding lanes doesn’t end congestion. (Houston Chronicle)

America’s largest freeway is Houston’s 23-lane Katy Freeway. It’s been widened many times, always ostensibly with the idea of eliminating congestion. But no matter how wide it gets, added capacity just induces farther-flung development and more peak hour driving, with the result that the freeway is even slower today than it was when it was widened just a few years ago. Texas spent $2.3 billion to widen the road, but just three years after it opened, the morning commute had increased by 25 minutes (or 30 percent) and the afternoon commute by 23 minutes (or 55 percent). It’s just one of many examples of how expanding freeway capacity to fight congestion is simply futile.


Even in the Lone Star State, they’re willing to cancel big road projects

In Dallas: A park instead of a highway.

For decades, Portland has prided itself on its 1970s era decision to tear out one freeway (Harbor Drive) and to forego building another one (the Mount Hood Freeway). Meanwhile, in much of the Sunbelt, cities like Houston built more and wider freeways. But even in Texas, the tide is turning. Just this month, the City of Dallas junked decades-old plans to build a six-lane tollway to relieve downtown traffic congestion. Called the Trinity Parkway, the billion-dollar road would have been built in the floodway of the long-neglected Trinity River that flows in and near downtown Dallas. For years, the project moved forward with a steady, and familiar, refrain:

Supporters of the road have long said it is crucial to relieving daily congestion on the knot of highways surrounding downtown.

But earlier this month, the Dallas City Council voted 13-2 to cancel the tollway.  Instead, the Trinity River floodplain will be developed as a park. Kinda like what Portland did with its waterfront four decades ago.


If you widen first, and toll later, you’ll waste millions or billions

One aspect of Louisville, Kentucky’s transportation system looks a lot like Portland’s. Louisville lies just south of the Ohio River, and every day, tens of thousands of suburban Hoosiers use the interstate freeway to commute to jobs in Louisville, mostly on the I-65 bridge. (In Portland, it’s tens of thousands of Washingtonians crossing the Columbia River, principally on Interstate 5, to jobs in Oregon.) Until a couple of years ago, the I-65 river crossing, like I-5, consisted of six travel lanes. Six months ago, Kentucky and Indiana completed a billion dollar freeway widening project that expands I-65 to twelve lanes (by twinning the existing Ohio River bridge). To help pay for the new bridge, the states started charging a toll that averages about $2 (with big discounts for regular commuters). The result: despite doubling capacity, the number of people using the I-65 crossing has fallen by almost half. Now the new super-sized river crossing is grossly under-used, even at rush hour.

This is rush hour on I-65 in downtown Louisville, with tolls (and a billion dollars of un-needed freeway).

If Louisville had tolled the river crossing before committing to constructing additional capacity, it would have realized it didn’t need anything like 12 lanes over the Ohio River–the existing bridges would have sufficed.

In Oregon’s case, the Legislature has directed the Oregon Department of Transportation to get federal permission to toll Interstate 5 and a parallel route (I-205). Given Kentucky and Indiana’s experience, it would be wise to implement tolls first, before making any additions to existing freeway capacity. The overwhelming evidence is that tolling could reduce, delay or even eliminate the need for costly freeway widening.


Tearing out a freeway makes a better city.

Going, going . . . (Stantec, via CNU)

Rochester, New York is in the process of removing and filling in a depressed (and depressing) urban freeway it built in the 1960s.  Removing the “Inner Loop” freeway is reconnecting downtown neighborhoods, and helping stimulate development.  The city has just approved a new mixed use development on former freeway land that includes 120 units of housing. More housing and fewer roads are the cornerstones of revitalizing the city’s downtown, according to the Congress for the New Urbanism.

Lessons learned?

Looking at the experience of other cities should tell Portland’s leaders that freeway widening projects like the proposed Rose Quarter expansion are ineffective, costly, unnecessary, and out of date. Houston’s experience shows that adding more lanes doesn’t reduce congestion; it just induces more traffic. Louisville shows that if you’re going to toll freeways, you can expect a big drop in traffic that will likely obviate any perceived need for more lanes. And Dallas shows that even in traditionally auto-dominated cities, it’s possible to simply walk away from outdated freeway expansion plans. For those who are really serious about reclaiming valuable urban space for people, it makes sense to tear out freeways, as Rochester is currently doing, rather than build more. Portland was once a leader in re-thinking how to reduce auto-dependence; today, there are valuable lessons it can learn from other cities.

Uber’s Movement: A peek at ride-hailing data

Uber’s lifting the veil–just a little–to provide data on urban transportation performance

Uber’s new Movement tool provides a valuable new source of data about travel times in urban environments. We’ve gotten an early look at Movement, and think it’s something you’ll want to investigate if you’re interested in urban transportation.

Uber likes to bill itself as a technology company, rather than a transportation company: technically, it’s the independent driver-owners of vehicles that provide the transportation service; Uber uses an array of information technology to arrange, monitor, finance, and evaluate the transaction.  In the process, Uber generates a huge amount of data about the trips that people take and the level and speed of traffic in cities. Access to this ride data has been hotly debated for a number of reasons. Customers, rightly, are interested in protecting their privacy. Ride-hailing companies naturally are seeking to keep this valuable market information from their competitors.

Ride-hailing companies have also been reluctant to share this data with public authorities. New York has managed to force disclosure of some information (which served as the basis of Bruce Schaller’s report showing ride-hailing having a material impact on New York travel speeds). San Francisco, working with IT experts from Northeastern University, figured out how to scrape information about ride-hailing trips within the city from the company’s public-facing web sites. Now Uber has stepped forward and started making at least some of its data directly available to everyone.

Movement: a portal to Uber’s travel time data

Uber’s made its new Movement data analysis tool open to the public this week.  Initially it’s just providing data for a handful of cities including Boston, Washington, Manila and Singapore, but the company promises to add more cities as time goes by.

The Movement interface is straightforward and simple to use.  Its greatest utility is the ability to easily generate data on actual travel times for a given route over a number of different dates. This kind of simple time-series analysis tool can help identify where travel times are increasing or decreasing compared to some base period.  This can be extremely useful for diagnosing the effect of transportation investments or observing the effects of system disruptions (like the Atlanta Freeway collapse).

An Example: How has a typical Washington DC commute changed in the past year?

Suppose you live in Bethesda Maryland, and commute by car to the Brookings Institution near Dupont Circle in Washington. How has your commute changed in the past year?  We used the Movement tool to select an address in central Bethesda and 1775 Massachusetts Avenue NW as our origin and destination, respectively.  We chose two time periods (the first quarter of 2017 and the first quarter of 2016), and restricted our search to weekdays, and the AM Peak period (from 7 am to 10 am).  The results are shown below:


On average it takes about 31 minutes and 4 seconds to make this morning commute, down almost 3 minutes from the time required in the previous year (33 minutes 57 seconds). The map’s color coding shows that most commute destinations from Bethesda involve shorter trips (shaded green) than they did in the previous year. Helpfully, the interface also shows the range of travel times for trips taken during these periods; this range reflects the geometric standard deviation about the arithmetic mean of the travel time data. Morning commutes on this route ranged from 23.5 to 41.0 minutes in the first quarter of 2017, compared to a range of 26.5 to 43.5 minutes in the prior year. So while the mean commute is down nearly three minutes, the range is still broadly the same as it was in the prior year.
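To make the “geometric standard deviation” idea concrete, here’s a minimal sketch of how a range like the one Movement reports could be computed from a sample of trip durations. The sample values and code are ours, purely for illustration; they are not actual Uber data or Uber’s implementation:

```python
import math

# Hypothetical trip durations for one origin-destination pair, in minutes.
trips = [28.0, 31.5, 25.0, 36.0, 30.0, 33.5, 27.0, 40.0]

n = len(trips)
mean = sum(trips) / n  # arithmetic mean travel time

# Geometric standard deviation: exponentiate the standard deviation of
# the log-durations. Travel times are roughly log-normal, so a
# multiplicative spread is a natural way to describe them.
logs = [math.log(t) for t in trips]
log_mean = sum(logs) / n
log_sd = math.sqrt(sum((x - log_mean) ** 2 for x in logs) / n)
gsd = math.exp(log_sd)

# A "typical range" one geometric SD wide around the mean:
low, high = mean / gsd, mean * gsd
print(f"mean {mean:.1f} min, typical range {low:.1f}-{high:.1f} min")
```

For the Bethesda commute above, the reported range (23.5 to 41.0 minutes around a roughly 31-minute mean) is consistent with a geometric SD of about 1.32, i.e. mean divided by 1.32 at the low end and mean times 1.32 at the high end.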

There are some important limitations to this data. The Movement interface reveals trip times only for origin-destination pairs that have a sufficient number of trips (undertaken by Uber drivers) to enable them to calculate average trip times. While this is not a problem in the dense, urban environments which are the richest market for ride-hailing companies, data are sparse in lower density areas, and don’t appear at all for some suburb-to-suburb trips. While this is understandable (Uber can’t generate data for trips that no one buys from it), it’s important to keep this in mind when looking at the data. Fortunately, Uber has disclosed the threshold it uses for presenting data for any set of origin-destination pairs: in general, there have to be at least 5 trips between the origin and destination during the time period examined, and for privacy purposes, the trips have to be made by at least 3 different customers. In addition, Uber filters out origin-destination pairs that have fewer than 30 observations in a given month. (And for those concerned about privacy, the origins and destinations of actual Uber trips aren’t disclosed in the Movement interface, just the estimates of how much time a trip would take based on the average of all trips recorded by Uber along these routes.)
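The disclosure thresholds described above can be sketched as a simple filter. This is our own reconstruction from Uber’s description, not its actual code, and the data layout (a list of rider-id/month pairs per origin-destination pair) is hypothetical:

```python
# Reconstruction of the stated disclosure rules for one origin-destination
# pair: at least 5 trips, made by at least 3 distinct riders, and at least
# 30 observations in each month that is reported.

def publishable(trips):
    """trips: list of (rider_id, month) tuples for one O-D pair."""
    if len(trips) < 5:
        return False                              # too few trips overall
    if len({rider for rider, _ in trips}) < 3:
        return False                              # too few distinct riders
    per_month = {}
    for _, month in trips:
        per_month[month] = per_month.get(month, 0) + 1
    return all(count >= 30 for count in per_month.values())

# 30 trips by 3 distinct riders in one month clears every threshold:
print(publishable([(i % 3, "2017-01") for i in range(30)]))
```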

As a result of its service patterns and these filtering provisions, Uber’s data has a heavily urban focus. Its data for the Washington DC area covers the entire area within the Beltway. (Areas shaded blue, green and yellow are reported in Movement; areas shaded gray are not.)

It’s also worth remembering that Movement data tell us a lot about traffic speed, but essentially nothing about traffic volumes. Uber vehicles are essentially a sample of vehicles traveling at different times, but Uber lacks data about how many other vehicles are on the road. So essentially, we’ll still have to rely mostly on old-school traffic counting technology for vehicle counts.

Keeping it smart: Transparent and consistent

Going forward, we hope Uber extends its Movement tool to all the major markets it serves. It’s a great example of how “big data” can be made easily available to ordinary citizens, and it’s a terrific public service for Uber to share this. That said, we have a couple of pieces of advice for Uber.

First, in order to be useful, especially for time series analysis, the data has to be consistent. For now, the data in Movement goes back to 2015, but not earlier. Future data availability hinges in part on the company’s continued existence, but another risk is that methodology changes and “series breaks” may make it difficult to track change accurately over time. Much as we appreciate Uber’s civic-mindedness in sharing this data, we’re also aware of how vulnerable this makes us. For several years, Inrix–another major provider of real-time travel data, similarly derived from vehicle-based GPS measurements–published monthly data on travel times in major US markets. But then, abruptly, in 2014, the company simply discontinued publication of its city-level data. Since then, the company has produced a series of reports decrying the congestion problem, but not presenting data consistent with its earlier methodologies, making it impossible to independently verify its claims. There’s little doubt that the performance data generated by Uber and other ride-hailing companies will be central to public policy debates about transportation and the impact of ride-hailing; we hope they’ll be willing to provide this data on an ongoing basis in an open format, using consistent methodologies.

Second, and relatedly, the definitions and methodologies used to produce the data need to be as transparent as possible, allowing for appropriate concerns about customer privacy and the competitive value of this data. In our beta testing of Movement, Uber did a terrific job of answering our questions. You can download the data from their website for use in other programs, and as noted above, the site reports the range of observed travel times, as well as averages, so that users can get a sense of the variance in travel times as well. All these details make the data more useful for meaningful analysis.

How do I get access to the data?

Access to Uber’s Movement data is available to anyone with a free Uber account. If you already have an account, navigate to

Editors note: Uber provided City Observatory with the opportunity to be a beta tester of the Movement data and interface. City Observatory was not compensated for this testing.


Is it a net zero home if it has a three-car garage?

Another model energy-saving project ignores density and location

The National Institute of Standards and Technology has built what it calls a model “net-zero” energy home on its Gaithersburg, Maryland campus. The house is festooned with arrays of solar cells that generate more electricity than the house consumes, and it is extensively insulated and air-tight, with high-efficiency windows.

This is great stuff. But still, we have a problem. Calling this house “net zero” leaves out a big part of its energy use–transportation.

The home is a 2,700 square foot single family structure with what looks like a three-car garage. The home is quite a bit larger than the average home–and heating and cooling a large amount of space requires more energy than heating and cooling a smaller one. Also, because it’s a single family building (and one located on a large lot), it doesn’t share common walls with other homes, which also tends to increase heating and cooling costs. And while the building is technically net-zero, because it generates more energy than it consumes, that energy isn’t free: between the added efficiency features and the solar cells, NIST estimates the structure costs about $120,000 more than a standard home.

It’s a bit misleading to call any structure “net zero” without looking at its impact on transportation, because driving is a major source of energy use (and carbon emissions). And as we’ve pointed out in the case of the 1,800-space parking garage for the National Renewable Energy Laboratory in Golden, Colorado, looking at just the structure’s energy budget, without looking at the transportation costs built into its location, significantly understates its energy consumption, environmental impact and affordability.

Location matters

NIST has built its Net Zero demonstration home on its campus in Gaithersburg, and as an image from Google Maps makes clear, it’s a classic large-lot suburban home.

Not only that, but Gaithersburg (a suburb of Washington DC) is a very automobile dependent city. According to Walk Score, Gaithersburg’s overall walkability rates 49 on a scale of 100 points, making it “car-dependent.” The Gaithersburg area has some neighborhoods and apartment complexes that are considerably more walkable, but single family houses (like the net zero house) tend to be isolated from common destinations.

Figuring out how to build homes that use less energy is an important way to reduce greenhouse gas emissions. But how big those homes are, and where they’re located, are in many ways at least as important as what technologies they use. Perhaps NIST’s next demonstration project can be a zero-net energy apartment building located in an urban neighborhood with a Walk Score of 95 or more, and easy access to transit.

Hundred dollar bills on the municipal sidewalk

The public wealth of cities is substantial, but under-pricing public assets is rampant

There’s an old saw among economists.  Two economists are walking along, and one of them says, “Look, there’s a hundred dollar bill on the sidewalk.” The second economist says, “It can’t be a hundred dollar bill; if it was, somebody would have picked it up by now.”

The point is that economists tend to believe that markets work perfectly, and there are few, if any, opportunities to get something for nothing. That old adage came to mind when we read a recent story about how municipalities apparently have billions of dollars of hidden assets that they’re just not using.

We were drawn to Governing‘s write-up of a new book called “The Public Wealth of Cities.” It argues that cities everywhere are sitting on huge asset bases, and if they could just monetize the value of those assets they’d have the resources they badly need to maintain civic infrastructure, provide services and pay down pension debts. Philadelphia’s Jeremy Nowak and Brookings’ Bruce Katz have both chimed in with their endorsement of the concept. And Transportation for America is even holding a web conference on recycling public assets that aims to show how to do this.

The authors, Dag Detter and Stefan Folster of Sweden, argue that many public assets are under-valued. Pre-1980 assets in many cities aren’t valued at all; and many assets are carried on the municipal books at their historic cost, not their current value. If you correctly valued these assets (at their market value or replacement cost) they’d be worth tens or hundreds of billions of dollars. In theory, if you could just earn a very modest rate of return on these assets–as you would from a portfolio of stocks or bonds–they’d generate lots of revenue. Here’s Governing:

If Cleveland put its assets into an urban wealth fund, a modest yield of 3 percent on a fund with a market value in the neighborhood of $30 billion could amount to an income of $900 million a year. That’s nearly double what the city earned in tax revenue in 2014 and is money that could be spent on infrastructure, health care and other critical needs.

Sounds pretty appealing.
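The arithmetic behind the Governing quote is easy to verify. Here is a minimal sketch, using the figures exactly as quoted (the $30 billion valuation and the 3 percent yield are the quote’s assumptions, not ours):

```python
# Back-of-the-envelope check of the Governing claim: a "modest" 3 percent
# yield on an urban wealth fund valued at roughly $30 billion.
fund_value = 30_000_000_000   # assumed market value of Cleveland's assets
yield_rate = 0.03             # assumed "modest" annual return

annual_income = fund_value * yield_rate
print(f"${annual_income / 1e6:,.0f} million per year")  # → $900 million per year
```

That matches the $900 million a year in the quote; everything turns on whether such a yield could actually be earned on those assets.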

But how do you earn a rate of return on a public asset, like a library, convention center or road?  The return has to come from the fees that you charge for its use.  And that’s the problem. For a variety of reasons, some very good and others not so good, we dramatically under-price the use of some assets relative to their market value. Take parks: they occupy huge amounts of very valuable land, but are mostly free to use, and even those parts of park systems that do charge fees typically only partly cover operating costs, and pay nothing in the form of a cash return on the asset. Getting a market rate of return would require charging users not just for operations, but an amount that would return net income to the asset owner.

What’s at fault here is both public sector book-keeping and public sector management. The public sector generally pays little attention to the balance sheet, and to correctly valuing assets. Public sector budgets are overwhelmingly focused on the income statement: does money coming in equal or exceed money going out? Seldom do public sector books accurately account for assets, make allowance for depreciation, or pay an explicit financial return on invested capital. But this isn’t just a book-keeping issue, it also comes down to management. The public sector almost always under-prices the user fees it charges for the use of its assets, in part because the users (citizens) are also the stockholders.

For some things, it makes little sense to charge users a price. In the case of parks, we generally imagine them to be a public good. We’re more than happy to discount their use as a way of providing widely shared benefits that are potentially available to all citizens regardless of income. And because many aspects of park use are “non-rivalrous” there’s a good argument to be made that we don’t want to discourage use by charging fees. And with many such public goods, the transaction cost of collecting revenue would be prohibitive.

But in the case of other public assets, underpricing, though politically entrenched, is at the root of bad outcomes. Take parking. The portion of the public right of way dedicated to parking is enormous, and quite valuable.  Its value is seldom reflected in municipal financial reports, and except in a few places where we install parking meters or have neighborhood parking districts, we generally give it away for free. Monetizing this asset would require charging users for its value–which in the case of parking would dramatically improve the efficiency of its use. Parking spaces are “rival” goods, and there’s a strong economic argument for asking users to pay their full costs.

As we all know, that’s the rub:  Many people have become accustomed to having roads and parking spaces be “free” and are loath to pay for them.  If turning some public assets, like parking spaces, over to the private sector would facilitate better pricing, it may be a road to realizing the hidden value of infrastructure. But the trick here is not so much public vs. private management, or whether we classify these things as “public wealth,” but whether we charge the correct price. At the end of the day, what’s required is the steely-eyed determination to charge a market price to customers who’ve become accustomed to getting something free or at a very deep discount.

What a congestion report doesn’t tell us about congestion

Congestion is increasing in Portland: But not, apparently, because traffic volumes are increasing

Traffic congestion reports are just as formulaic as bodice-ripping romance novels. They have a predictable narrative form: our region is growing; it has more people and more jobs and more cars. And the number of people and jobs and cars is growing (gasp!) faster than the number of lane miles of freeway and highway in the region.  The inevitable result must be greater congestion–unless we build our way out of the problem.

The Oregon Department of Transportation (ODOT) released a report earlier this week describing traffic trends on Portland area freeways. Like all such reports, it’s profusely illustrated with a combination of statistics and photographs showing cars, lots of cars, usually with photos taken through an extreme telephoto lens.  The report shows speeds have decreased and peak hour travel times have increased on most freeways over the past two years. It makes a big deal of the fact that the region’s employment has increased by 5.5 percent and the region’s population has increased by 3.0 percent while freeway lane-miles have increased by only 1.0 percent.

Obligatory telephoto lens shots of cars in traffic: it’s a traffic report.

What the report doesn’t tell us is even more important than what it does say.  It does little to explain why congestion seems to be increasing, even though traffic levels are going down in many locations, and actually says very little about what we can do to reduce congestion.

Slower travel times–but less travel.

The usual story about congestion is that our roads are clogged because of an ever increasing volume of cars using them. But in this report, congestion is getting worse (travel times are increasing) even though on nearly half of the region’s freeway segments, travel is decreasing.  It’s also worth noting that travel volumes are down even though the economy and population continue to grow.

The report shows that travel times are increasing on most Portland area freeways.  On Interstate 5, the major north-south route through the region, the average number of miles traveled per day has actually decreased about 6 percent from 2,778,000 in 2013 to 2,709,000 in 2015.  At the same time, afternoon peak hour travel speeds on the roadway have fallen from 36 miles per hour to 31 miles per hour in the more heavily traveled northbound direction. For I-5, the report offers no clear explanation why travel times are increasing even as traffic volumes are decreasing. The same pattern holds for I-405, where speeds are down 3 to 7 percent, even though traffic counts are down 10 to 12 percent.  In the case of US 26 (which reaches the region’s western suburbs), traffic counts are up in the eastbound direction, but down in the westbound direction.  Six of thirteen major freeway segments carried less traffic in 2015 than they did two years earlier–even though the region’s population and employment had increased noticeably.

For nearly two decades, Portland’s population and economy have grown, and per person travel has decreased. Detailed studies conducted by the regional government show that since the 1990s, average vehicle miles traveled per person have declined 20 percent and average trip lengths have decreased about 14 percent. Driving less and taking shorter trips (due to increased density and better mixing of land use types) means people are less exposed to congestion.

A flawed prescription

While packaged as a “performance report” on the region’s highways, this document is really a sales brochure for upcoming ODOT investments to widen three Portland area freeways.  The report’s press release claims:

Approval of HB 2017, the Oregon Legislature’s 2017 transportation package, provides funding for targeted safety and congestion projects that can help address the issues found in the new report.

HB 2017 provides some initial funding for three freeway projects to widen portions of I-5 in Portland, I-205 in the region’s southern suburbs, and Oregon 217 to the West. All three have been sold as “bottleneck busters.”

But when you read the detail in the report, it’s apparent that while wider roads and electronic signage may reduce crashes, they won’t reduce regular recurring congestion:

Let’s decipher the jargon here.  “Auxiliary lane” is engineer-speak for widening the freeway.  “Real Time” projects are electronic signs and modifications to ramp meters. “Improving reliability” means decreasing congestion associated with fender benders; and “safety” means fewer fender benders, not fewer injuries or deaths.  And “improve performance” doesn’t mean less congestion, it means–once again–fewer crashes.  There’s also a clear admission that these lane widenings and signals won’t eliminate congestion. So if we translate this into ordinary English, here’s what ODOT is saying:

Our plan is to widen freeways in a few spots and install more electronic signs which we hope will reduce crashes and the congestion associated with those crashes. These won’t reduce regular, daily rush hour congestion that’s not caused by crashes.  There’s actually nothing we can do about routine congestion.

In effect, ODOT is acknowledging what is now called “The Fundamental Law of Road Congestion.”  Increases in roadway capacity in dense urban environments simply induce additional travel, especially at the peak hour, with the result that congestion continues unabated. Spending tens or hundreds of millions to widen freeways in the name of reducing congestion is a futile waste of money.

Even the claim that electronic signs can reduce crashes is questionable.  The sole bit of evidence on the efficacy of the signs comes from a project ODOT implemented on Highway 217 in 2014.

The report also claims that “active transportation management”–a combination of tweaks to ramp metering and electronic traffic information and variable speed signs–added to Highway 217 were associated with a 21 percent reduction in crashes on that roadway. But the evidence for this seems conflicting.

It’s hard to square the headline claim on page 5 about Highway 217 (“TOTAL CRASHES down by 21%”) with the detail contained on the corridor report (page 49). While the southbound lanes reported an 18 percent reduction in crashes, the northbound lanes had a 3 percent increase.  If one combines the northbound and southbound data, total crashes are down 10 percent from 2013 to 2015, i.e. from 312 crashes in 2013 (121+191), to 281 crashes in 2015 (125+156), a decline of 31 crashes (312-281=31). Moreover, according to the ODOT report, some key data on travel trends in the corridor either isn’t available or is compromised because the corridor was under construction during the reporting period. ODOT’s Traffic Volume Tables data for Highway 217 (which doesn’t distinguish travel directions) reports that average daily traffic (ADT) on the northern segments of Highway 217 declined by about 10 percent between 2013 and 2015 (from 108,700 ADT to 96,400 ADT near the Sunset Highway). Traffic decreased less or increased slightly on segments further south.

If crashes are down about ten percent because traffic volumes are down as much as ten percent on certain segments of Highway 217, that’s hardly a testament to the effectiveness of the information signs.
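The combined crash arithmetic can be re-checked in a few lines, using the corridor-report totals as quoted above:

```python
# Highway 217 total crashes (both directions combined), from the ODOT
# corridor report figures quoted in the text.
crashes_2013 = 121 + 191   # the two directional totals for 2013
crashes_2015 = 125 + 156   # the two directional totals for 2015

decline = crashes_2013 - crashes_2015
pct_change = 100 * decline / crashes_2013
print(crashes_2013, crashes_2015, decline, round(pct_change, 1))
# → 312 281 31 9.9
```

A roughly 10 percent decline in crashes, against a roughly 10 percent decline in traffic on the corridor’s northern segments.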

Time losses are grossly exaggerated

Another key claim of the ODOT report is that highway congestion costs travelers 35,000 hours of lost time daily, up from 28,000 hours in 2013. But, as we’ve shown before, these kinds of estimates are wildly exaggerated. First, it’s useful to know how ODOT computed “lost time.” According to page 18 of their report, they assumed that the “free flow” speed of Portland’s freeways was 60 miles per hour. Then they treated any additional time that people spent traveling at lower speeds as “time lost” due to congestion.  It’s worth noting that virtually all Portland area freeways have a 55 mile per hour speed limit, so ODOT is counting as a “cost” the inability of motorists to exceed the legal speed limit on Portland area freeways.
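To see how the 60 mile per hour baseline manufactures “lost time,” consider a hypothetical 10-mile freeway trip driven at exactly the 55 mile per hour limit (the trip length is our illustrative assumption; the two speeds come from the report and the posted limit):

```python
# A driver covers 10 miles at the 55 mph legal limit; ODOT's methodology
# scores any time beyond the 60 mph free-flow travel time as "lost."
trip_miles = 10
actual_mph = 55      # legal speed limit on most Portland area freeways
baseline_mph = 60    # ODOT's assumed free-flow speed

actual_minutes = 60 * trip_miles / actual_mph
baseline_minutes = 60 * trip_miles / baseline_mph
phantom_delay = actual_minutes - baseline_minutes
print(round(phantom_delay, 2))  # → 0.91 minutes "lost" without ever speeding
```

Summed over hundreds of thousands of daily freeway trips, that phantom minute per trip inflates the region-wide “lost time” total considerably.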

Second, and perhaps more importantly, there’s no feasible set of highway capacity improvement projects that could eliminate peak hour congestion. In effect, the cost of additional road capacity needed to avoid congestion would greatly exceed the value of travel time savings.

Summing up

More congestion is no doubt aggravating, but unfortunately this new ODOT report sheds little light on the problem. Despite the implication that growth is increasing congestion, it’s actually the case that travel volumes are down on a number of key roadways. Expanding highway capacity won’t reduce regular peak hour congestion, a fact that the report implicitly concedes, and something that has been confirmed by academic studies and real world experience. The evidence that information signs will reduce crashes and associated delay is sketchy. The report exaggerates the time lost to congestion by counting the inability to violate the speed limit as a “cost.”



Dying to widen highways

Oregon’s DOT seems to be more concerned with making cars go faster than saving lives

Yesterday, we took a look at a recent Oregon Department of Transportation (ODOT) “performance report” on Portland area freeways.  One of its main messages–which we found some problems with–deals with congestion.  But the report also seems to devote a lot of attention to another issue:  safety.

On the surface, it sounds like ODOT has gotten the memo on “Vision Zero”–the idea that we should be working to reduce traffic deaths to zero. Their new report uses the word “crash” 199 times and the word “safety” 48 times in a 59-page document. They’re sure working hard to convince us that they’re really interested in reducing crashes and improving safety.

But strangely, there are three words missing from this report: “death,” “fatality” and “injury.”  None of these words appears anywhere in the report.  That’s important because not all crashes are equal. According to Metro’s State of Safety Report, we have lots and lots of crashes in the Portland metro area:  more than 18,000 a year.  But fewer than half of these crashes involve injuries (about 7,500).  About 500 crashes are classified as “serious,” because they involve fatalities or major incapacitating injuries. But for the most part, the crashes that happen on the freeways occur during congested periods, when cars are traveling slowly, and seldom result in anything more than minor injuries.

If we’re going to achieve Vision Zero, it really means concentrating our energy and resources on the serious crashes, and not the more numerous, but far less serious ones.

The really bad news–which you also won’t find mentioned in the ODOT “performance” report–is the fact that not only are we not making progress on Vision Zero, we’re losing ground in a big way.  Oregon is currently experiencing an epidemic of roadway deaths. Fatalities on Oregon roadways are up 58 percent since 2013. Crashes killed 495 Oregonians in 2016, up from 313 in 2013.

But with the new $5.8 billion transportation bill passed by the Legislature, there’s an opportunity to turn this around. There are some token efforts in that direction. The bill trumpets an allocation of $10 million annually to a statewide safe routes to school program, for example. Surely, ODOT is going to devote the new resources it’s getting from the increase in the gas tax and vehicle registration fees approved by the Legislature this summer to address this massive safety problem, right?

Not so much.  Widening freeways seems to be a higher priority. The biggest single allocation of funds from HB 2017–the legislature’s new transportation bill–is a down payment on a multi-hundred million dollar project to widen the Interstate 5 freeway near the Rose Quarter.  Two other projects could lead to the widening of Highway 217 in the region’s western suburbs and I-205 to the South, at a total cost estimated at nearly a billion dollars.

Despite the impression conveyed in the freeway performance report, none of these projects addresses an area with a significant concentration of serious crashes.  The report makes it clear that ODOT is primarily interested in crashes not because they kill and maim Oregonians, but because they’re associated with slower freeway traffic. Fender benders often worsen traffic congestion because it takes time to clear away damaged vehicles, which leads to longer delays.

Freeways, especially congested ones, aren’t our big safety problem

The fact is, the region’s freeways are overwhelmingly the safest places to travel. In contrast, arterial streets, particularly those with multiple travel lanes, are the most dangerous places. On average, the region’s arterials have five times as many serious crashes per mile traveled as freeways, according to the Metro study, a finding they called “one of the most conclusive relationships in this study.”

Freeways: 5 times safer than arterials. (Source: Metro, State of Safety)

Here’s what the study had to say:

Arterial roadways are the location of the majority of the serious crashes in the region (Figure 2-8). A similar relationship is evident for pedestrians and cyclists, as detailed in Sections 5 and 6. Freeways and their ramps are relatively safe, per mile travelled, compared to arterial and collector roadways (Figure 2-9).

Source: Metro, State of Safety.

And there’s a further irony about the travel data.  Not only are freeways safer than arterials and other roads, they’re actually even safer when they’re congested than when they’re not.  Congestion forces traffic to drive more slowly, which reduces the severity of crashes. When roads are severely congested, the serious crash rate falls precipitously.  Here’s the Metro data: The crash rate for severe congestion (when the volume of traffic exceeds the roadway’s computed capacity) is 0.9 serious crashes per 100 million vehicle miles traveled. This is actually much lower than the 1.73 serious crashes per 100 million vehicle miles traveled in “moderate” congestion (when the freeway is between 90 and 100 percent of its rated capacity). What this relationship suggests is that reducing congestion from “severe” to “moderate” would actually almost double the serious crash rate.
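The gap between those two Metro crash rates is worth making explicit:

```python
# Metro's serious-crash rates per 100 million vehicle miles traveled,
# as quoted above for two levels of freeway congestion.
rate_severe = 0.90     # severe congestion (volume exceeds rated capacity)
rate_moderate = 1.73   # moderate congestion (90-100% of rated capacity)

ratio = rate_moderate / rate_severe
print(round(ratio, 2))  # → 1.92
```

Nearly twice the rate, which is the basis for the “almost double” claim above.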

Of course, you won’t find this key fact in the ODOT performance report. What you will find, however, built into its calculations about how much time motorists are “losing” to traffic congestion, is the assumption that free-flow traffic speeds on Portland freeways ought to be 60 miles per hour (five miles per hour higher than the legally posted speed limit on nearly all of the region’s freeways).  The report treats any time a motorist ends up traveling less than sixty miles per hour on Portland-area freeways as time they “lost” due to traffic congestion. This is clearly an agency that prioritizes speed over saving lives.

Why aren’t we putting our money where the serious crashes are?

As Vancouver’s Brent Toderian is fond of saying, “The truth about a city’s aspirations isn’t found in its vision, it’s in its budget.”

Spending money on freeways and freeway widening does almost nothing to address our most serious safety problems and the devastating increase in fatal crashes. We know from the careful analytical work that’s been done by Metro and regional safety officials that highway deaths in the Portland area are overwhelmingly the result of crashes on the multi-lane arterials, streets like Powell Boulevard, Division Street, 82nd Avenue, Barbur Boulevard and others. And these roadways are especially deadly to the most vulnerable road users: pedestrians and cyclists.

Freeways aren’t the most dangerous roads, they’re the safest. And congested freeways have fewer serious crashes than less congested ones. This new performance report makes it clear that the Oregon Department of Transportation is an agency that’s gung-ho to spend money on extremely expensive projects to widen highways, but doesn’t seem to be doing anything to prioritize its investments to the kinds of locations that are killing and injuring increasing numbers of Oregonians. Instead, its chief interest seems to be in wrapping a wasteful and ineffective freeway widening project in the rhetorical mantle of safety as a sales gimmick.

(This post has been updated to correct a formatting error that concealed the first two of the last three paragraphs in the commentary).

Such a deal

How tax policy subsidizes homeownership, mostly for the wealthiest Americans

OK.  Imagine that someone offers you this investment deal.  We want you to buy some stock; in fact, we want you to buy about $150,000 or $200,000 in stock.  In a single company.  Sounds a bit risky, doesn’t it?  But keep listening, we have some special terms that make this a very attractive deal.

We have such a deal for you (unless you are a renter).

First, we’ll help you buy the stock “on margin”–you only need to come up with 10 or 20 percent of the purchase price of the stock.  We’ll lend you 80 or 90 percent of the cost of buying the stock. (And it will be a guaranteed, long term, low interest rate loan, with a fixed rate).  So right away, you’ve got leverage on your investment of five- or ten-fold, meaning that any increase in price of the stock multiplies your profits by a factor of five or ten.

Second, this is a special preferred stock that pays annual dividends of about 5 percent.  So your $200,000 investment pays about $10,000 per year in dividends.  And thanks to a special deal we’ve cut with the government, and unlike any other preferred stock, you get to take these dividends tax-free.

Third, of course, you had to borrow a bunch of money to buy this stock, but good news:  you can deduct the interest you pay on your loan from your taxable income.

Fourth, someday, when you sell your stock, you’ll have a capital gain, but again, we have yet another special deal with the government, so that you can exclude any capital gain you make on this stock from your taxable income.

It’s a great deal isn’t it?

So what is this fantasy stock?  It isn’t a stock, but it isn’t a fantasy either:  It’s owner-occupied housing in the US.
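The leverage arithmetic in the deal above is worth working through with the text’s numbers ($200,000 purchase, 10 percent down; the 5 percent price appreciation is our illustrative assumption):

```python
# Why 10x leverage multiplies returns: a $200,000 home, 10 percent down.
price = 200_000
down_payment = 0.10 * price     # $20,000 of the buyer's own equity
appreciation = 0.05             # assumed price gain (illustrative)

gain = price * appreciation     # the gain accrues on the whole asset
return_on_equity = gain / down_payment
print(f"{return_on_equity:.0%}")  # → 50%: a 5% price rise becomes a
                                  # 50% return on the buyer's equity
```

That tenfold multiplication of gains (ignoring interest costs and the risk that prices fall, which is likewise multiplied) is what “buying on margin” means in the analogy.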

In fact, the higher the income, and the bigger the home you buy, the more you stand to profit from all of these special deals. The value of the mortgage interest deduction, exclusions of imputed rent, and deductibility of state and local taxes are all worth more to you if you are in a higher tax bracket. The mortgage interest deduction only has value if you earn enough to itemize deductions, meaning few low income homeowners can take advantage. It’s estimated that 90 percent of the value of the mortgage interest deduction goes to households with incomes of $100,000 or more. The higher the income and the more you spend on housing, the more your leveraged investment is worth, the more imputed rental income you shelter from taxes, the larger a capital gain you stand to earn.

One of the most important–and usually overlooked–provisions of the tax code is the exclusion from taxation of imputed rental income. Here’s how this works.  If you rent your home from someone else, they have to pay income tax on the net income they get from renting. If you rent your home from yourself (i.e. you are a homeowner), you don’t have to pay income tax on that “income.” Your $200,000 home might rent for $1,500 a month, and if you are the owner and live there, you’re effectively getting that amount of income (net of upkeep), tax free. That makes housing different from virtually every other investment, because the dividends from housing (the ability to live rent free in a home you own) don’t get counted as taxable income.
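A rough sketch of what that exclusion is worth, using the example in the text (the 25 percent marginal tax rate is our illustrative assumption, and upkeep is ignored):

```python
# Untaxed imputed rent on a $200,000 home that would rent for $1,500/month.
monthly_rent = 1_500
annual_imputed_rent = 12 * monthly_rent   # $18,000 of untaxed "income"
marginal_rate = 0.25                      # assumed tax bracket (illustrative)

tax_avoided = annual_imputed_rent * marginal_rate
print(annual_imputed_rent, tax_avoided)   # → 18000 4500.0
```

An otherwise-identical landlord renting out the same house would owe tax on that $18,000 (net of expenses); the owner-occupant owes nothing.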

You can even buy a second home and get the same deal.

Collectively, the value of these tax breaks for owner occupied housing constitute a subsidy of more than $250 billion per year from the federal government.

The irony here is that there’s almost no evidence that these tax provisions have actually increased homeownership. A new research paper by Jonathan Gruber and two colleagues shows that the deductibility of mortgage interest tends to increase the size of home that people purchase, but doesn’t influence whether they choose to be homeowners. Little wonder we have a nation of huge houses with tons of empty bedrooms.

Oh, and if you’re a renter, you’re probably asking, what kind of deal do we have for you? In short, no deal. Our advice is that you really ought to be a homeowner. But for lower income people, the path to homeownership is difficult and risky. As we’ve explained, low income buyers tend to buy at the wrong time (when the market is near a peak), pay more for credit, buy in more volatile neighborhoods and are more vulnerable in downturns. It’s also the case that the value of all of these tax code subsidies is smallest for low income households.

What we’ve done is to make homeownership a great deal . . . but only if you’ve got enough income to benefit from all the provisions the tax code offers.

The Week Observed, August 11, 2017

What City Observatory did this week

1. How luxury housing becomes affordable. It’s always been the case that developers build new housing for those at the top end of the market. It’s true today, and it was true 50 and 100 years ago. We look back at “luxury” apartments built in Portland in 1910 and in the 1960s to see how they’ve gradually filtered their way down market and are now part of the region’s less expensive housing.

2. Can a net zero house have a three-car garage? We take a close look at the National Institute of Standards and Technology’s Gaithersburg, Maryland Net Zero Model home, which, in addition to an array of solar panels, sports three garage doors. What the accounting for this house leaves out is the substantial energy consumption associated with its suburban location. If a house is located where its occupants need to drive for nearly all daily errands, auto related energy use is going to be much higher than if the same household lived in an apartment in a walkable neighborhood.

3. What a congestion report doesn’t tell us about congestion. We review a new report from Oregon’s Department of Transportation which catalogs increased congestion on Portland area freeways. Surprisingly, congestion has increased on many road segments, even though traffic counts are down. Like other congestion reports, the methodology used greatly exaggerates the time “lost” to congestion: in this case, it treats any time spent traveling at less than sixty miles an hour as time “lost”–even though the speed limit on nearly all Portland area freeways is 55 miles per hour. And importantly, the report concedes that proposed highway widening projects will do little if anything to reduce daily peak hour congestion, except that associated with fender-bender crashes; and even the evidence for that claim is speculative.

4. Dying to widen highways. In part 2 of our review of the recent Oregon Department of Transportation freeway performance report, we call out the “safety-washing” of proposed freeway widening projects. While the report uses the word crash almost 200 times, it omits the salient detail that Portland area freeways are, on average, 5 times safer than the region’s arterial streets. It’s actually the case that congested freeways, where cars have to slow down, have fewer serious accidents than ones with lower levels of congestion. As with most metro areas, safety won’t be improved by making cars go faster, but by lessening speeds, especially where cars conflict with other, more vulnerable road users. Expensive freeway widening projects have little to do with saving lives or reducing serious injuries.

Must read

1. How paying for parking works: Now in easy-to-read comic strip form. You probably haven’t waded through all 900-odd pages of Don Shoup’s magisterial “The High Price of Free Parking.” And now you don’t have to: Its essential lessons have been boiled down to a short comic strip called “The Particular Parable of the Lyft Lot,” published by Bloomberg Businessweek.  It tells a story–which we related some months back–about how, at its San Francisco headquarters, the ride-hailing company Lyft implemented a plan to charge employees who parked in one of the firm’s few parking spots and to use the proceeds to subsidize trips taken by those who didn’t use parking. The comic strip shows how this worked: it assured parking for those who needed it, and helped underwrite the cost of alternatives for everyone else.

2. In other parking news, The Seattle Times tells us about one weird trick that makes parking pricing work even better: daily rates. The Gates Foundation has nearly 1,200 employees in downtown Seattle. Rather than charging a monthly price for parking, it charges a daily price (with a monthly maximum). Employees who don’t use the foundation parking lot get a $3 daily payment. If you pay monthly, you are paying for parking whether you use it or not; daily payment gives employees the flexibility to carpool, bus or bike when they can, but assures that on days when it’s more convenient or worthwhile they can park. As with the Lyft experiment in San Francisco, charging daily rates rather than monthly ones has dramatically reduced daily driving by office workers.


The Week Observed, August 25, 2017

What City Observatory did this week

1. Such a deal.  Suppose someone offered you this investment deal: You can get $200,000 in preferred stock, that pays a 5% annual dividend tax free, and when it comes time to sell this investment, all of your capital gains will be tax free, too. Can’t afford $200K? No problem, put down $20K, and we’ll loan you the balance–you’ve got 10X leverage, so if the price goes up at all, you’ll reap a ten-fold reward on your investment. Sounds too good to be true? It’s real: real estate. These are the key features of the housing finance system for owner-occupied homes in the US, and explain why we have such huge houses, and such widespread homeownership (at least among those with enough income to take advantage of the features of the tax system which give the greatest subsidy to those with the highest income).  And for renters we’ve got . . . nothing (save for the admonition that it would be better if they were high income homeowners).

2. Why freeway widening is still a bad idea in 2017. We look at recent experiences in Houston, Dallas, Louisville and Rochester, all of which are weighing (or rethinking) the desirability of wider freeways. Houston’s Katy Freeway amply demonstrates the fundamental law of road congestion (wider roads induce more driving, and worse congestion). Dallas is cancelling a major new road project through the center of the city. Louisville’s experience shows that tolling actually reduces traffic, and can obviate the need for new capacity. And Rochester is doing what some cities–like Portland–used to do: tearing out a freeway to rebuild urban neighborhoods.

Going, going . . . (Stantec, via CNU)

Must read

1. Here’s how to measure changes in rent. StreetEasy, a New York-based real estate marketplace, has published a new report looking at changes in rents and incomes in New York City. Not surprisingly, rent increases since 2010 have generally outpaced incomes in the city. What’s really great about this report is the care with which the rental price indices have been assembled. While it’s common practice simply to report the median value of current listings (a technique that is easy, but can be biased by selection effects, since the mix of units listed changes over time), StreetEasy has gone the extra mile and computed a repeat sales index that looks at changes in the prices of the same apartment at different points in time to estimate a quality-adjusted rate of change in rents. In addition, their estimates incorporate the value of landlord concessions. Their methodology is clearly spelled out as well. This is a model for how to tackle the important task of measuring rent levels.
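The repeat-sales idea can be illustrated with a toy example (this is a deliberately simplified sketch with made-up listings, not StreetEasy's actual model, which is a regression-based index): by comparing the same unit with itself over time, quality differences between apartments can't bias the measured change.

```python
# Toy repeat-sales rent index: only units observed in both periods
# contribute, so a shift in the mix of listings (e.g., more luxury
# units on the market) doesn't distort the measured growth rate.

from statistics import median

# (unit_id, year, monthly_rent) -- hypothetical listings
listings = [
    ("apt_a", 2010, 2000), ("apt_a", 2014, 2600),
    ("apt_b", 2010, 3500), ("apt_b", 2014, 4200),
    ("apt_c", 2014, 9000),  # luxury unit listed only once: excluded
]

def repeat_growth(listings, y0, y1):
    """Median rent growth among units observed in both years."""
    by_unit = {}
    for unit, year, rent in listings:
        by_unit.setdefault(unit, {})[year] = rent
    growths = [v[y1] / v[y0] - 1
               for v in by_unit.values() if y0 in v and y1 in v]
    return median(growths)

print(f"{repeat_growth(listings, 2010, 2014):.0%}")  # 25%
```

Note that a naive median of 2014 listings would be dragged upward by the luxury unit (apt_c), even though no existing tenant's rent rose that fast; the repeat-sales approach sidesteps that selection effect.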

2. Why school segregation is like zoning. Mike Lewyn has a short but trenchant post that emphasizes an important dynamic that is at work when we rely on “local control.” Whether we’re talking about housing or schools (and the two are hardly separate), it’s to every neighborhood’s advantage to draw boundaries in a way that includes richer people and excludes poorer ones (and that’s true regardless of whether we’re talking about school attendance boundaries or zoning boundaries). When every neighborhood tries to pursue this strategy, it results in the systematic exclusion of the poor, and contributes to greater segregation (and higher housing costs) for everyone.

3. The minimum wage and automation. In a new paper, economist David Neumark argues that one effect of a higher minimum wage will be to encourage employers to automate more tasks, displacing some low wage workers. Are we just hastening the rise of the robots by giving low wage workers raises? Bloomberg View’s Noah Smith argues that this effect is a feature, not a bug, of minimum wage laws. Higher wages serve as an inducement to devise and deploy new labor-saving technologies. That’s not a new phenomenon: The high price of labor in England was one of the reasons for the industrial revolution. And with good macroeconomic policies, it’s possible to grow the economy while getting rid of many of the most dirty, menial and physically demanding jobs.




The Week Observed, August 18, 2017

What City Observatory did this week

1. Hundred dollar bills on the municipal sidewalk? There’s a lot of interest in tapping the hidden value of municipal assets to address city financial problems. The typical city owns billions of dollars of assets, particularly valuable urban land, and at least in theory, if it could earn even a modest financial return on these assets, the city could raise substantial revenue. The problem is that getting more revenue ultimately hinges on charging users, often the citizen-beneficiaries of public services, something closer to the market price for the resources or services they receive. This raises political and policy problems.

2. Why we’re talking about Portland’s freeways. Long a leader in progressive transportation policy, having torn out one freeway and cancelled several others, Portland is now considering a billion dollar package of freeway-widening projects. We explain why the debate over these projects is of more than local interest, and why we’ll be following this debate in the weeks ahead at City Observatory.

3. Driven Apart: How sprawl is lengthening American commutes. Our path-breaking 2010 report on the misleading nature of the indices usually used to measure traffic congestion has been unavailable for some time. We’re republishing it at City Observatory. It shows that the widely used travel time index conceals the additional time that sprawl adds to commuting trips. It turns out that more compact cities have shorter total travel times, even though they may get poor grades according to the travel time index.

Must read

1. Mapping metro travel times. The Washington Post‘s Sahil Chinoy has a superb piece of data journalism showing peak and off-peak travel distances in US metro areas. Using an archive of cell-phone based travel data, he estimates how far you can travel in one hour by car from the city center of each of the nation’s largest metro areas at 4pm, 7pm and 10pm on a given Friday. The results are all mapped at the same scale, on 75-mile-radius maps, so you can easily compare metro areas. The following maps show the areas reachable within one hour in New York (left) and Philadelphia (right) at 4pm (red), 7pm (yellow) and 10pm (green) on a typical Friday evening.

2. Private bike share flourishing in Seattle. Writing at Market Urbanism, Asher Meyers describes the initial success of two privately run bike sharing systems in Seattle, both of which use the newer “dockless” model that relies on smart, GPS-equipped bikes. LimeBike and Spin have each set up 1,000-bike fleets in the city, entering after the much-publicized demise of the city-sponsored Pronto system. While there’s been some backlash against so-called “rogue” operators and the illegal parking of some bikes, the entry of these privately funded systems seems likely to lead to even greater use of bikes as part of the urban transportation system.

3. America’s bus ridership is declining. The Wall Street Journal reports that over the past decade, bus ridership in the US has declined by about 10 percent. There are many causes, chief among them cutbacks in the number of routes and hours of service. Strapped for funds, bus operators cut back, which further reduces the utility of the system and leads to lower ridership, creating a “transit death spiral.” Service reductions have disproportionate effects on poor and carless households, who face longer, slower journeys and less access to jobs and services.

In the News

City Observatory’s Joe Cortright is quoted in last week’s Washington Post article on highway congestion.

So what’s behind nightmarish traffic? According to Cortright, it’s about zoning and segregation. The model of the modern American city, with separate sections for living, working, shopping and eating, spurs congestion.

Willamette Week synopsizes our analysis of the “safety-washing” of proposed freeway expansion projects by the Oregon Department of Transportation.

But the focus of the transportation package is on big highway projects, which aren’t a leading cause of fatal crashes. Instead, state figures show, a disproportionate number of deaths occur on secondary roads and city streets.